
Neural Networks and Amplification


Introduction

I put together a feature-packed Python script that covers trigonometric functions, calculus, exponential functions, polynomials, logarithms, and even neural networks all at once. I hope you find it useful!

Python Code

This Python program visualizes and analyzes the output voltage of a sine-wave input amplified by a factor of K, in a variety of ways.


Overview

  • Input signal: a sine wave at the note C4 ("Do", 261.63 Hz)
  • Output signal: y = K × x (K = 1000)
  • Goal: analyze this output via the time axis, the input-output relation, derivative/integral, polynomial approximation, and a neural network
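Before running the full script, the numbers implied by this setup can be sanity-checked by hand. The sketch below (not part of the original script) computes the gain in decibels, the output peak, and the analytic peak of the derivative of y = K·sin(2πft):

```python
import math

K = 1000     # gain
f = 261.63   # frequency of C4 ("Do") [Hz]

gain_db = 20 * math.log10(K)        # voltage gain in decibels
peak_out = K * 1.0                  # sine amplitude is 1, so output peak = K
deriv_peak = 2 * math.pi * f * K    # peak of dy/dt for y = K*sin(2*pi*f*t)

print(f"gain = {gain_db:.1f} dB")
print(f"output peak = {peak_out} V")
print(f"derivative peak = {deriv_peak:.3e} V/s")
```

These values (60 dB of gain, a 1000 V output peak, and a derivative peak of roughly 1.64 × 10^6 V/s) are what the plots below should show.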
import numpy as np
import matplotlib.pyplot as plt
import math
from scipy.integrate import cumulative_trapezoid  # corrected import
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from numpy.polynomial.polynomial import Polynomial
import warnings
warnings.filterwarnings("ignore")

# --- Basic settings ---
fs = 1000  # number of sample points over the time window
f = 261.63  # frequency of the note C4 ("Do") [Hz]
t = np.linspace(0, 2 / f, fs)  # time vector (2 periods)
x = np.sin(2 * np.pi * f * t)  # input voltage (sine wave)
K = 1000
y = K * x  # output voltage = K × input

# --- Print K in decibels ---
print(f"K = {K}, in dB = {20 * math.log10(K):.2f} dB")

# --- Plot 1: output voltage vs time ---
plt.figure()
plt.plot(t, y, label='Output Voltage (K×Input)')
plt.xlabel("Time [s]")
plt.ylabel("Output Voltage")
plt.title("Output Voltage vs Time")
plt.grid(True)
plt.legend()
plt.show()

# --- Plot 2: output vs input ---
plt.figure()
plt.plot(x, y, label='y = Kx')
plt.xlabel("Input Voltage")
plt.ylabel("Output Voltage")
plt.title("Output = K × Input")
plt.grid(True)
plt.legend()
plt.show()

# --- Derivative and integral ---
dy_dt = np.gradient(y, t)                            # numerical derivative
int_y = cumulative_trapezoid(y, t, initial=0)        # corrected: cumulative integral (constant = 0)

plt.figure()
plt.plot(t, dy_dt, label='dy/dt')
plt.plot(t, int_y, label='Integral of y')
plt.xlabel("Time [s]")
plt.ylabel("Voltage")
plt.title("Derivative and Integral of Output Voltage")
plt.legend()
plt.grid(True)
plt.show()

# --- Polynomial approximation ---
deg = 3  # degree of the approximating polynomial
p_coef = Polynomial.fit(x, y, deg).convert().coef
p = Polynomial(p_coef)
y_poly = p(x)

plt.figure()
plt.plot(x, y, label='True Output')
plt.plot(x, y_poly, '--', label='Polynomial Fit')
plt.xlabel("Input Voltage")
plt.ylabel("Output Voltage")
plt.title(f"Polynomial Fit (Degree {deg})")
plt.legend()
plt.grid(True)
plt.show()

# Show polynomial coefficients
print(f"Polynomial coefficients (degree {deg}): {p_coef}")

# --- Neural-network regression ---
activations = ['logistic', 'tanh', 'relu']
for act in activations:
    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(10, 10), activation=act, max_iter=10000, random_state=0)
    )
    model.fit(x.reshape(-1, 1), y)
    y_pred = model.predict(x.reshape(-1, 1))

    plt.figure()
    plt.scatter(x, y, s=10, label='True', alpha=0.5)
    plt.scatter(x, y_pred, s=10, label=f'NN Prediction ({act})', alpha=0.5)
    plt.xlabel("Input Voltage")
    plt.ylabel("Output Voltage")
    plt.title(f"NN Regression - Activation: {act}")
    plt.legend()
    plt.grid(True)
    plt.show()
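As a follow-up check (not part of the original script), the numerical results can be compared against closed-form expressions: the numerical derivative should match 2πfK·cos(2πft), and a degree-3 polynomial fit of the perfectly linear map y = Kx should recover a slope of about K with negligible constant, quadratic, and cubic terms:

```python
import numpy as np
from numpy.polynomial.polynomial import Polynomial

# same signal definitions as in the script above
fs, f, K = 1000, 261.63, 1000
t = np.linspace(0, 2 / f, fs)
x = np.sin(2 * np.pi * f * t)
y = K * x

# numerical derivative vs the analytic derivative 2*pi*f*K*cos(2*pi*f*t)
dy_dt = np.gradient(y, t)
analytic = 2 * np.pi * f * K * np.cos(2 * np.pi * f * t)
max_rel_err = np.max(np.abs(dy_dt - analytic)) / np.max(np.abs(analytic))

# degree-3 fit of a linear relation: the x^1 coefficient should be ~K
coef = Polynomial.fit(x, y, 3).convert().coef
print(f"derivative max relative error: {max_rel_err:.2e}")
print(f"fitted coefficients: {coef}")
```

With 1000 samples over two periods, the finite-difference error is tiny, and `coef[1]` comes out essentially equal to K = 1000 while the other coefficients vanish, confirming that the polynomial fit recovers the linear gain exactly.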

Results

(Figures: the plots produced by the script above — output vs. time, output vs. input, derivative and integral, polynomial fit, and NN regression.)
