
Create Pytorch DataLoader from numpy.array


I stumbled a bit when creating a PyTorch DataLoader from a scikit-learn dataset (ndarray).

A note for future reference.

# Create the data
import numpy as np
import torch
from sklearn.datasets import fetch_openml  # fetch_mldata was removed in scikit-learn 0.22
from sklearn.model_selection import train_test_split

mnist = fetch_openml("mnist_784", version=1, as_frame=False)
X = mnist.data.astype(np.float32)  # shape (70000, 784)
y = mnist.target.astype(np.int64)  # shape (70000,)


# Pull each instance out of the ndarray, convert it to a torch.Tensor, and stack them
tensor_X = torch.stack([torch.from_numpy(np.array(i)) for i in X])
tensor_y = torch.stack([torch.from_numpy(np.array(i)) for i in y])
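Incidentally, since X and y are already contiguous ndarrays, a single `torch.from_numpy` call yields the same tensors without the per-element loop. A minimal sketch (using random stand-in arrays instead of MNIST):

```python
import numpy as np
import torch

# Stand-in arrays with the same dtypes as the MNIST data above
X = np.random.rand(100, 784).astype(np.float32)
y = np.random.randint(0, 10, size=100).astype(np.int64)

# from_numpy converts the whole array at once (and shares memory with it)
tensor_X = torch.from_numpy(X)
tensor_y = torch.from_numpy(y)
```

This is both shorter and faster, because it avoids creating 70,000 intermediate one-element tensors.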

# Split into train and test
train_size = 60000
X_train = tensor_X[:train_size]
y_train = tensor_y[:train_size]
X_test = tensor_X[train_size:]
y_test = tensor_y[train_size:]

# Create the DataLoaders
train_dataset = torch.utils.data.TensorDataset(X_train, y_train)
train_loader = torch.utils.data.DataLoader(train_dataset)

test_dataset = torch.utils.data.TensorDataset(X_test, y_test)
test_loader = torch.utils.data.DataLoader(test_dataset)
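The loaders above use DataLoader's defaults (batch size 1, no shuffling). A minimal sketch of how you would typically configure and consume one for training, assuming a batch size of 64 (random stand-in data instead of MNIST):

```python
import numpy as np
import torch

# Stand-in training data with the same dtypes as above
X_train = torch.from_numpy(np.random.rand(256, 784).astype(np.float32))
y_train = torch.from_numpy(np.random.randint(0, 10, size=256).astype(np.int64))

train_dataset = torch.utils.data.TensorDataset(X_train, y_train)
# batch_size and shuffle are the options you usually want for training
train_loader = torch.utils.data.DataLoader(train_dataset, batch_size=64, shuffle=True)

for batch_X, batch_y in train_loader:
    # each batch_X has shape (64, 784), each batch_y has shape (64,)
    pass
```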
X = mnist.data.astype(np.float32)
y = mnist.target.astype(np.int64)

The reason for converting the types here is to match the dtypes that PyTorch expects: float32 for model inputs and int64 for class labels.
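Concretely, `nn.Linear` parameters default to float32, and `nn.CrossEntropyLoss` requires int64 (Long) class targets; passing mismatched dtypes raises a RuntimeError. A minimal sketch with a toy model:

```python
import torch
import torch.nn as nn

model = nn.Linear(784, 10)         # parameters are float32 by default
criterion = nn.CrossEntropyLoss()  # expects int64 (Long) class targets

X = torch.randn(8, 784)            # float32 inputs, as produced by the cast above
y = torch.randint(0, 10, (8,))     # int64 targets
loss = criterion(model(X), y)      # works; float64 X or float32 y would fail
```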
