
How Deep Learning Works, Diagrammed by ChatGPT

I had ChatGPT explain deep learning and generate a Mermaid diagram of it.

The Mermaid source is as follows:

graph LR;
    Input[Input] --> InputLayer
    
    %% Input layer - kept simple
    subgraph InputLayer[Input Layer]
        A1((Input))
        A2((Input))
        A3((Input))
    end
    
    %% Hidden layers - connections grouped for better readability
    subgraph HiddenLayer1[Hidden Layer 1]
        H1((○))
        H2((○))
        H3((○))
    end
    
    subgraph HiddenLayer2[Hidden Layer 2]
        H4((○))
        H5((○))
        H6((○))
    end
    
    subgraph HiddenLayer3[Hidden Layer 3]
        H7((○))
        H8((○))
        H9((○))
    end
    
    %% Output layer - simple and easy to read
    subgraph OutputLayer[Output Layer]
        O1((Output))
        O2((Output))
    end

    %% Connections from the input layer to hidden layer 1
    A1 --> H1
    A1 --> H2
    A1 --> H3
    A2 --> H1
    A2 --> H2
    A2 --> H3
    A3 --> H1
    A3 --> H2
    A3 --> H3

    %% Connections from hidden layer 1 to hidden layer 2 (dense crossing)
    H1 --> H4
    H1 --> H5
    H1 --> H6
    H2 --> H4
    H2 --> H5
    H2 --> H6
    H3 --> H4
    H3 --> H5
    H3 --> H6

    %% Connections from hidden layer 2 to hidden layer 3 (dense crossing)
    H4 --> H7
    H4 --> H8
    H4 --> H9
    H5 --> H7
    H5 --> H8
    H5 --> H9
    H6 --> H7
    H6 --> H8
    H6 --> H9

    %% Connections from hidden layer 3 to the output layer (crossing)
    H7 --> O1
    H7 --> O2
    H8 --> O1
    H8 --> O2
    H9 --> O1
    H9 --> O2

    %% Connections from the output layer to the final output
    O1 --> 結果[Final Output]
    O2 --> 結果
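
The diagram shows a small fully connected feed-forward network: three inputs, three hidden layers of three units each, and two outputs. As a rough illustration (not part of ChatGPT's output above), the following NumPy sketch runs a forward pass through the same layer sizes; the random weights and the ReLU/softmax activations are assumptions, since the diagram does not specify any.

import numpy as np

rng = np.random.default_rng(0)

# Layer sizes taken from the diagram: 3 inputs, three hidden layers of 3 units, 2 outputs.
sizes = [3, 3, 3, 3, 2]

# Randomly initialised weights and biases (the article gives no values).
weights = [rng.standard_normal((n_in, n_out))
           for n_in, n_out in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n_out) for n_out in sizes[1:]]

def forward(x):
    """Propagate one input vector through every fully connected layer."""
    h = x
    for i, (W, b) in enumerate(zip(weights, biases)):
        z = h @ W + b
        if i < len(weights) - 1:
            h = np.maximum(z, 0.0)           # ReLU in the hidden layers (assumed)
        else:
            z = z - z.max()                  # for numerical stability
            h = np.exp(z) / np.exp(z).sum()  # softmax over the two outputs (assumed)
    return h

print(forward(np.array([0.5, -1.2, 3.0])))   # a length-2 probability vector summing to 1

Each arrow in the Mermaid graph corresponds to one entry of a weight matrix, so the dense bundles of crossing edges between layers are exactly what the matrix products above compute in a single step.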