GAN for Celeb Faces¶

Import Libraries¶

At first we need to import the libraries. It is considered as standard imports.

In [1]:
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader
from torchvision import datasets, transforms, models
from torchvision.utils import make_grid

from PIL import Image
from IPython.display import display
import cv2


import glob
import os
import random
import numpy as np
import pandas as pd
import pickle as pkl
import matplotlib.pyplot as plt 
%matplotlib inline
import warnings
warnings.filterwarnings('ignore')

We will be using this function mostly everywhere to run our experiments deterministically. Random functions of Numpy and Pandas will behave deterministically after this. To learn more about Deterministic Neural Networks please check out this notebook

In [2]:
def seed_everything(seed=1234):
    random.seed(seed)
    os.environ['PYTHONHASHSEED'] = str(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed(seed)
    torch.backends.cudnn.deterministic = True
seed_everything(42)

Dataloader and Visualization¶

Dataloader provides an iterable over the specified dataset by combining a dataset with a sampler.

In [3]:
# Let's define the path
data_dir = '../input/celeba-dataset/img_align_celeba/'
In [4]:
# defining the image transform rules
'''
The augmentions are as follows, 
1. Resizing the image to 32x32 as the smaller the image the faster we can train 
2. Cropping from center with 32x32
3. Chagning type to tensor
we won't be using normalize here as we will have to do that manually later for tanh activation function
'''

transform_img = transforms.Compose([
        transforms.Resize(32),  
        transforms.CenterCrop(32),
        transforms.ToTensor()
    ])
In [5]:
def get_dataloader(batch_size, data_dir, transforms):
    """
    Batch the neural network data using DataLoader
    :param batch_size: The size of each batch; the number of images in a batch
    :param data_dir: Directory where image data is located
    :param transforms: data augmentations
    :return: DataLoader with batched data
    """
    
    ds = datasets.ImageFolder(data_dir, transform_img)
    data_loader = torch.utils.data.DataLoader(ds, batch_size=batch_size,num_workers= 4, shuffle=True)

    return data_loader
In [6]:
batch_size = 32 # Instead of individual samples, the data loader produces batched samples of given number
train_loader = get_dataloader(batch_size,data_dir, transform_img)
In [7]:
print(len(train_loader.dataset))
202599
In [8]:
# Let's see if the train loader is working and sending us iterable images :) 
# Also note that we converted images to tensor above with ToTensor(). Now we need to convert back to numpy to plot them

# obtain one batch of training images which means 32 images
dataiter = iter(train_loader)
img, _ = dataiter.next() # _ for labels, Dataloader sends labels automatically as defined in Dataloader class.
#However we don't need labels neither we assigned one :p so we put _ 
# plot the images in the batch, along with the corresponding labels
fig = plt.figure(figsize=(30, 7))
plot_size=30 # gonna plot 30 images only
for idx in np.arange(plot_size):
    ax = fig.add_subplot(3, plot_size/3, idx+1, xticks=[], yticks=[])
    npimg = img[idx].numpy()
    plt.imshow(np.transpose(npimg, (1, 2, 0)))

Now we need to scale our images as the output of a tanh activated generator will contain pixel values in a range from -1 to 1. So, we need to rescale our training images to a range of -1 to 1.

In [9]:
def scale_images(x, max = 1.00 , min = -1.00):
    x = x * (max - min) + min
    return x
In [10]:
#let's check the scaling
img = img[5]
print('Before scaling min: ', img.min())
print('Before scaling max: ', img.max())

scaled_img = scale_images(img)

print('After Scaling Min: ', scaled_img.min())
print('After Scaling Max: ', scaled_img.max())
Before scaling min:  tensor(0.)
Before scaling max:  tensor(0.9804)
After Scaling Min:  tensor(-1.)
After Scaling Max:  tensor(0.9608)

now our min and max are in range of [-1,1]

Defining Model¶

Discriminator¶

The discriminator is a classifier that detects if the input samples are genuine or fabricated. As a result, the discriminator in a GAN is essentially a classifier. It attempts to discern between genuine data and data generated by the generator. It could utilize any network architecture suitable for the type of data it's classifying. Here I will be using convolutional classifier, only without any maxpooling layers. It is recommended to employ a deep network with normalization to deal with this kind of difficult data.

In [11]:
'''
The inputs to the discriminator are 32x32x3 tensor images
The output would be a single value that will indicate whether a given image is real or fake
'''
def conv(in_channels, out_channels, kernel_size=4, stride=2, padding=1, batch_norm=True, bias = False):
    layers = []
    conv_layer = nn.Conv2d(in_channels=in_channels, out_channels=out_channels, 
                           kernel_size=kernel_size, stride=stride, padding=padding)
    #appending convolutional layer
    layers.append(conv_layer)
    #appending batch norm layer
    if batch_norm:
        layers.append(nn.BatchNorm2d(out_channels))
        
    return nn.Sequential(*layers)
In [12]:
class Discriminator(nn.Module):

    def __init__(self, conv_dim):
        """
        Initializing the Discriminator Module
        :param conv_dim: The depth of the first convolutional layer based on which we will create the  next ones where next  layer depth = 2 * previous layer depth
        """
        super(Discriminator, self).__init__()

        # complete init function
        self.conv_dim = conv_dim
        
        
        self.conv1 = conv(3, conv_dim, batch_norm=False)  
        self.conv2 = conv(conv_dim, conv_dim*2)           
        self.conv3 = conv(conv_dim*2, conv_dim*4)
        self.conv4 = conv(conv_dim*4, conv_dim*8)
        self.fc = nn.Linear(conv_dim*4*4*2, 1)

    def forward(self, x):
        """
        Forward propagation of the neural network
        :param x: The input to the neural network     
        :return: Discriminator logits; the output of the neural network
        """
        # define feedforward behavior
        x = F.leaky_relu(self.conv1(x), 0.2)
        x = F.leaky_relu(self.conv2(x), 0.2)
        x = F.leaky_relu(self.conv3(x), 0.2)
        x = F.leaky_relu(self.conv4(x), 0.2)
        
        x = x.view(-1, self.conv_dim*4*2*4)
        
        x = self.fc(x)
        
        
        return x

Generator¶

The name,itself is self explanatory. The generator component of a GAN learns to generate fake data by incorporating discriminator feedback. It learns to manipulate the discriminator so that its output is classified as real. The generator should upsample an input and create a new image with the same dimensions as our training data (28x28x3). This should mostly consist of transpose convolutional layers with normalzed output.

In [13]:
def deconv(in_channels, out_channels, kernel_size=4, stride=2, padding=1, batch_norm=True, bias = False):
    layers = []
    
    # append transpose conv layer -- we are not using bias terms in conv layers
    layers.append(nn.ConvTranspose2d(in_channels, out_channels, kernel_size, stride, padding))
    
    # optional batch norm layer
    if batch_norm:
        layers.append(nn.BatchNorm2d(out_channels))
        
    return nn.Sequential(*layers)
In [14]:
class Generator(nn.Module):
    
    def __init__(self, z_size, conv_dim):
        """
        Initialize the Generator Module
        :param z_size: The length of the input latent vector, z
        :param conv_dim: The depth of the inputs to the *last* transpose convolutional layer
        """
        super(Generator, self).__init__()
        self.conv_dim = conv_dim
        
        self.fc = nn.Linear(z_size, conv_dim*4*4*4)
        # complete init function
        
        self.de_conv1 = deconv(conv_dim*4, conv_dim*2)
        self.de_conv2 = deconv(conv_dim*2, conv_dim)
        self.de_conv3 = deconv(conv_dim, 3, 4, batch_norm=False )
        
        self.dropout = nn.Dropout(0.3)
        
        
    def forward(self, x):
        """
        Forward propagation of the neural network
        :param x: The input to the neural network     
        :return: A 32x32x3 Tensor image as output
        """
        # define feedforward behavior
        x = self.fc(x)
        x = self.dropout(x)
        
        x = x.view(-1, self.conv_dim*4, 4, 4)
        
        x = F.relu(self.de_conv1(x))
        x = F.relu(self.de_conv2(x))
        x = self.de_conv3(x)
        x = F.tanh(x)
        
        
        return x

To help the models converge, we should initialize the weights of the convolutional and linear layers in your model. From the original DCGAN paper, they say:

All weights were initialized from a zero-centered Normal distribution with standard deviation 0.02.

In [15]:
#Initializing the weights to a normal distribution, centered around 0, with a standard deviation of 0.02.

def weights_init_normal(m):
    """
    :param m: A module or layer in a network    
    """
    # like `Conv`, `BatchNorm2d`, `Linear`, etc.
    classname = m.__class__.__name__
    
    #  initial weights to convolutional and linear layers
    if (classname.find('Conv') != -1 or classname.find('Linear') != -1):
        nn.init.normal(m.weight.data, 0.0, 0.2)
        
    if hasattr(m, 'bias') and m.bias is not None:
        nn.init.constant(m.bias.data, 0.0)
In [16]:
# model hyperparameters
d_conv_dim = 64
g_conv_dim = 128
z_size = 100
# building discriminator and generator from the classes defined above
discriminator = Discriminator(d_conv_dim)
generator = Generator(z_size=z_size, conv_dim=g_conv_dim)

# initialize model weights
discriminator.apply(weights_init_normal)
generator.apply(weights_init_normal)
print("done")
done
In [17]:
# let's look at our discriminator model
print(discriminator)
Discriminator(
  (conv1): Sequential(
    (0): Conv2d(3, 64, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1))
  )
  (conv2): Sequential(
    (0): Conv2d(64, 128, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1))
    (1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  )
  (conv3): Sequential(
    (0): Conv2d(128, 256, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1))
    (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  )
  (conv4): Sequential(
    (0): Conv2d(256, 512, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1))
    (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  )
  (fc): Linear(in_features=2048, out_features=1, bias=True)
)
In [18]:
# let's look at our generator model
print(generator)
Generator(
  (fc): Linear(in_features=100, out_features=8192, bias=True)
  (de_conv1): Sequential(
    (0): ConvTranspose2d(512, 256, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1))
    (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  )
  (de_conv2): Sequential(
    (0): ConvTranspose2d(256, 128, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1))
    (1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  )
  (de_conv3): Sequential(
    (0): ConvTranspose2d(128, 3, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1))
  )
  (dropout): Dropout(p=0.3, inplace=False)
)

Training the model¶

In [19]:
use_gpu = torch.cuda.is_available()

first, let's define essentials to train the model.

Optimizers¶

To learn about adam optimizer please check this blog. I found it helpful for me. :)

In [20]:
lr = 0.0002 #learning rate
beta1=0.5
beta2=0.999

# optimizers for the discriminator D and generator G
discriminator_optimizer = torch.optim.Adam(discriminator.parameters(), lr, (beta1, beta2)) # for discriminator
generator_optimizer = torch.optim.Adam(generator.parameters(), lr, (beta1, beta2)) # for generator

Loss functions for generator¶

The generator's goal is to get the discriminator to think its generated images are real.

In [21]:
def real_loss(D_out, smooth = False):
    '''Calculates how close discriminator outputs are to being real.
       param, D_out: discriminator logits
       return: real loss'''
    batch_size = D_out.size(0)
    
    if smooth:
        labels = torch.ones(batch_size)*0.9
    else:
        labels = torch.ones(batch_size) 
    
    if use_gpu:
        labels = labels.cuda()
    
    criterion = nn.BCEWithLogitsLoss()
    loss = criterion(D_out.squeeze(), labels)
    
    return loss

def fake_loss(D_out):
    '''Calculates how close discriminator outputs are to being fake.
       param, D_out: discriminator logits
       return: fake loss'''
    batch_size = D_out.size(0)
    labels = torch.zeros(batch_size)
    
    if use_gpu:
        labels = labels.cuda()
        
    criterion = nn.BCEWithLogitsLoss()
    loss = criterion(D_out.squeeze(), labels)
    
    return loss

This train function was provided in the udacity deep learning nanodegree course. It's great.

Training the discriminator by alternating on real and fake images. Then the generator, which tries to trick the discriminator and should have an opposing loss function.

In [22]:
def train(D, G, n_epochs, train_on_gpu, print_every=50):
    '''Trains adversarial networks for some number of epochs
       param, D: the discriminator network
       param, G: the generator network
       param, n_epochs: number of epochs to train for
       param, print_every: when to print and record the models' losses
       return: D and G losses'''
    
    # move models to GPU
    if train_on_gpu:
        D.cuda()
        G.cuda()

    # keep track of loss and generated, "fake" samples
    samples = []
    losses = []

    # Get some fixed data for sampling. These are images that are held
    # constant throughout training, and allow us to inspect the model's performance
    sample_size=16
    fixed_z = np.random.uniform(-1, 1, size=(sample_size, z_size))
    fixed_z = torch.from_numpy(fixed_z).float()
    # move z to GPU if available
    if train_on_gpu:
        fixed_z = fixed_z.cuda()

    # epoch training loop
    for epoch in range(n_epochs):

        # batch training loop
        for batch_i, (real_images, _) in enumerate(train_loader):

            batch_size = real_images.size(0)
            real_images = scale_images(real_images)

            # Train the discriminator on real and fake images
            discriminator_optimizer.zero_grad()
            
            if train_on_gpu:
                real_images = real_images.cuda()
                
            D_real = D(real_images)
            d_real_loss = real_loss(D_real)

            z = np.random.uniform(-1, 1, size=(batch_size, z_size))
            z = torch.from_numpy(z).float()

            if train_on_gpu:
                z = z.cuda()
                
            fake_images = G(z)
            
            D_fake = D(fake_images)
            d_fake_loss = fake_loss(D_fake)
            
            
            d_loss = d_real_loss + d_fake_loss
            d_loss.backward()
            discriminator_optimizer.step()     

            # 2. Train the generator with an adversarial loss
            generator_optimizer.zero_grad()
            
            z = np.random.uniform(-1, 1, size=(batch_size, z_size))
            z = torch.from_numpy(z).float()
            
            if train_on_gpu:
                z = z.cuda()
            
            fake_images = G(z)
            
            D_fake = D(fake_images)
            
            g_loss = real_loss(D_fake)
        
            g_loss.backward()
            generator_optimizer.step()
            
            # Print some loss stats
            if batch_i % print_every == 0:
                # append discriminator loss and generator loss
                losses.append((d_loss.item(), g_loss.item()))
                # print discriminator and generator loss
                print('Epoch [{:5d}/{:5d}] | d_loss: {:6.4f} | g_loss: {:6.4f}'.format(
                        epoch+1, n_epochs, d_loss.item(), g_loss.item()))


        ## AFTER EACH EPOCH##    
        # this code assumes your generator is named G, feel free to change the name
        # generate and save sample, fake images
        G.eval() # for generating samples
        samples_z = G(fixed_z)
        samples.append(samples_z)
        G.train() # back to training mode

    # Save training generator samples
    with open('train_samples.pkl', 'wb') as f:
        pkl.dump(samples, f)
    
    # finally return losses
    return losses
In [23]:
n_epochs = 10
losses = train(discriminator, generator, n_epochs=n_epochs, train_on_gpu = use_gpu)
Epoch [    1/   10] | d_loss: 6.5612 | g_loss: 2.7663
Epoch [    1/   10] | d_loss: 0.4061 | g_loss: 10.5832
Epoch [    1/   10] | d_loss: 0.0127 | g_loss: 10.5846
Epoch [    1/   10] | d_loss: 0.0047 | g_loss: 10.4223
Epoch [    1/   10] | d_loss: 0.0254 | g_loss: 10.4456
Epoch [    1/   10] | d_loss: 0.0187 | g_loss: 10.8196
Epoch [    1/   10] | d_loss: 0.0109 | g_loss: 10.1270
Epoch [    1/   10] | d_loss: 0.0124 | g_loss: 11.8992
Epoch [    1/   10] | d_loss: 0.0072 | g_loss: 9.8192
Epoch [    1/   10] | d_loss: 0.0547 | g_loss: 11.5808
Epoch [    1/   10] | d_loss: 0.0032 | g_loss: 10.5844
Epoch [    1/   10] | d_loss: 0.0046 | g_loss: 12.2854
Epoch [    1/   10] | d_loss: 0.0003 | g_loss: 10.1991
Epoch [    1/   10] | d_loss: 0.0054 | g_loss: 11.9062
Epoch [    1/   10] | d_loss: 0.0046 | g_loss: 9.0119
Epoch [    1/   10] | d_loss: 0.0155 | g_loss: 8.4228
Epoch [    1/   10] | d_loss: 0.0219 | g_loss: 10.3237
Epoch [    1/   10] | d_loss: 0.0019 | g_loss: 9.8443
Epoch [    1/   10] | d_loss: 0.0011 | g_loss: 12.4850
Epoch [    1/   10] | d_loss: 0.0015 | g_loss: 11.6433
Epoch [    1/   10] | d_loss: 0.0031 | g_loss: 8.3676
Epoch [    1/   10] | d_loss: 0.0020 | g_loss: 11.0091
Epoch [    1/   10] | d_loss: 0.0022 | g_loss: 10.2575
Epoch [    1/   10] | d_loss: 0.0027 | g_loss: 12.3327
Epoch [    1/   10] | d_loss: 0.0116 | g_loss: 12.1517
Epoch [    1/   10] | d_loss: 0.0193 | g_loss: 10.1760
Epoch [    1/   10] | d_loss: 0.0024 | g_loss: 11.0488
Epoch [    1/   10] | d_loss: 0.0010 | g_loss: 11.7721
Epoch [    1/   10] | d_loss: 0.0040 | g_loss: 10.5494
Epoch [    1/   10] | d_loss: 0.0035 | g_loss: 9.5180
Epoch [    1/   10] | d_loss: 0.0029 | g_loss: 10.0865
Epoch [    1/   10] | d_loss: 0.0085 | g_loss: 12.4167
Epoch [    1/   10] | d_loss: 0.0138 | g_loss: 10.8344
Epoch [    1/   10] | d_loss: 0.0016 | g_loss: 14.0485
Epoch [    1/   10] | d_loss: 0.0530 | g_loss: 13.6207
Epoch [    1/   10] | d_loss: 0.0021 | g_loss: 13.2141
Epoch [    1/   10] | d_loss: 0.0170 | g_loss: 14.2300
Epoch [    1/   10] | d_loss: 0.0030 | g_loss: 13.9500
Epoch [    1/   10] | d_loss: 0.0003 | g_loss: 24.5059
Epoch [    1/   10] | d_loss: 0.0016 | g_loss: 11.8578
Epoch [    1/   10] | d_loss: 0.0946 | g_loss: 11.5952
Epoch [    1/   10] | d_loss: 0.0105 | g_loss: 11.8044
Epoch [    1/   10] | d_loss: 0.0025 | g_loss: 12.3131
Epoch [    1/   10] | d_loss: 0.0591 | g_loss: 9.9238
Epoch [    1/   10] | d_loss: 0.0534 | g_loss: 9.9015
Epoch [    1/   10] | d_loss: 0.0961 | g_loss: 7.8785
Epoch [    1/   10] | d_loss: 0.0052 | g_loss: 9.3739
Epoch [    1/   10] | d_loss: 0.0000 | g_loss: 14.0197
Epoch [    1/   10] | d_loss: 0.0186 | g_loss: 8.3841
Epoch [    1/   10] | d_loss: 0.0004 | g_loss: 10.6725
Epoch [    1/   10] | d_loss: 0.0105 | g_loss: 8.2191
Epoch [    1/   10] | d_loss: 0.0514 | g_loss: 11.3652
Epoch [    1/   10] | d_loss: 0.0380 | g_loss: 10.7757
Epoch [    1/   10] | d_loss: 0.0203 | g_loss: 10.7881
Epoch [    1/   10] | d_loss: 0.0026 | g_loss: 10.4199
Epoch [    1/   10] | d_loss: 0.0076 | g_loss: 7.4264
Epoch [    1/   10] | d_loss: 0.0159 | g_loss: 9.1459
Epoch [    1/   10] | d_loss: 0.0157 | g_loss: 11.7919
Epoch [    1/   10] | d_loss: 0.2854 | g_loss: 12.6661
Epoch [    1/   10] | d_loss: 0.0307 | g_loss: 11.1065
Epoch [    1/   10] | d_loss: 0.0473 | g_loss: 11.0584
Epoch [    1/   10] | d_loss: 0.0345 | g_loss: 6.9105
Epoch [    1/   10] | d_loss: 0.0663 | g_loss: 8.7302
Epoch [    1/   10] | d_loss: 0.0596 | g_loss: 9.4033
Epoch [    1/   10] | d_loss: 0.0217 | g_loss: 5.6640
Epoch [    1/   10] | d_loss: 0.0822 | g_loss: 10.3895
Epoch [    1/   10] | d_loss: 0.0470 | g_loss: 9.8816
Epoch [    1/   10] | d_loss: 0.3291 | g_loss: 9.6115
Epoch [    1/   10] | d_loss: 0.2623 | g_loss: 12.2759
Epoch [    1/   10] | d_loss: 0.0542 | g_loss: 6.6386
Epoch [    1/   10] | d_loss: 0.0344 | g_loss: 7.7575
Epoch [    1/   10] | d_loss: 0.6824 | g_loss: 4.4776
Epoch [    1/   10] | d_loss: 0.0414 | g_loss: 8.3927
Epoch [    1/   10] | d_loss: 0.2055 | g_loss: 7.8103
Epoch [    1/   10] | d_loss: 0.0100 | g_loss: 10.5721
Epoch [    1/   10] | d_loss: 0.0766 | g_loss: 6.7549
Epoch [    1/   10] | d_loss: 0.0406 | g_loss: 5.9111
Epoch [    1/   10] | d_loss: 0.8992 | g_loss: 10.8868
Epoch [    1/   10] | d_loss: 0.0076 | g_loss: 11.8544
Epoch [    1/   10] | d_loss: 0.5268 | g_loss: 7.4693
Epoch [    1/   10] | d_loss: 0.0567 | g_loss: 6.4553
Epoch [    1/   10] | d_loss: 0.0939 | g_loss: 6.2571
Epoch [    1/   10] | d_loss: 0.1102 | g_loss: 7.3274
Epoch [    1/   10] | d_loss: 0.1671 | g_loss: 9.9786
Epoch [    1/   10] | d_loss: 0.3761 | g_loss: 6.2370
Epoch [    1/   10] | d_loss: 0.0769 | g_loss: 6.7871
Epoch [    1/   10] | d_loss: 0.1363 | g_loss: 4.8757
Epoch [    1/   10] | d_loss: 0.0075 | g_loss: 6.5986
Epoch [    1/   10] | d_loss: 0.2132 | g_loss: 3.8577
Epoch [    1/   10] | d_loss: 0.0983 | g_loss: 8.7573
Epoch [    1/   10] | d_loss: 0.0228 | g_loss: 10.0541
Epoch [    1/   10] | d_loss: 0.0277 | g_loss: 6.8616
Epoch [    1/   10] | d_loss: 0.2250 | g_loss: 5.5089
Epoch [    1/   10] | d_loss: 0.1363 | g_loss: 6.6739
Epoch [    1/   10] | d_loss: 0.0890 | g_loss: 5.0775
Epoch [    1/   10] | d_loss: 0.0655 | g_loss: 9.6614
Epoch [    1/   10] | d_loss: 0.0215 | g_loss: 3.0873
Epoch [    1/   10] | d_loss: 0.1568 | g_loss: 0.8227
Epoch [    1/   10] | d_loss: 0.1720 | g_loss: 11.1437
Epoch [    1/   10] | d_loss: 0.0219 | g_loss: 6.9864
Epoch [    1/   10] | d_loss: 0.0110 | g_loss: 3.8512
Epoch [    1/   10] | d_loss: 0.5016 | g_loss: 8.8873
Epoch [    1/   10] | d_loss: 0.2661 | g_loss: 4.9178
Epoch [    1/   10] | d_loss: 0.7582 | g_loss: 3.3910
Epoch [    1/   10] | d_loss: 0.2950 | g_loss: 4.4096
Epoch [    1/   10] | d_loss: 0.3192 | g_loss: 6.6792
Epoch [    1/   10] | d_loss: 1.7131 | g_loss: 2.3871
Epoch [    1/   10] | d_loss: 0.0504 | g_loss: 7.2594
Epoch [    1/   10] | d_loss: 0.2466 | g_loss: 4.5237
Epoch [    1/   10] | d_loss: 0.0284 | g_loss: 4.7077
Epoch [    1/   10] | d_loss: 0.5693 | g_loss: 5.1687
Epoch [    1/   10] | d_loss: 0.0455 | g_loss: 6.1223
Epoch [    1/   10] | d_loss: 0.0553 | g_loss: 4.0796
Epoch [    1/   10] | d_loss: 0.3348 | g_loss: 4.7157
Epoch [    1/   10] | d_loss: 0.5057 | g_loss: 3.8312
Epoch [    1/   10] | d_loss: 0.1226 | g_loss: 5.1582
Epoch [    1/   10] | d_loss: 0.2858 | g_loss: 4.2976
Epoch [    1/   10] | d_loss: 0.1123 | g_loss: 2.8502
Epoch [    1/   10] | d_loss: 0.1008 | g_loss: 4.5108
Epoch [    1/   10] | d_loss: 0.2572 | g_loss: 3.6568
Epoch [    1/   10] | d_loss: 0.1741 | g_loss: 6.1714
Epoch [    1/   10] | d_loss: 0.1814 | g_loss: 2.3431
Epoch [    1/   10] | d_loss: 0.0469 | g_loss: 5.1448
Epoch [    1/   10] | d_loss: 0.0619 | g_loss: 5.5334
Epoch [    1/   10] | d_loss: 0.0964 | g_loss: 6.3205
Epoch [    1/   10] | d_loss: 0.1182 | g_loss: 5.1639
Epoch [    1/   10] | d_loss: 0.3784 | g_loss: 2.4853
Epoch [    2/   10] | d_loss: 0.2726 | g_loss: 4.4901
Epoch [    2/   10] | d_loss: 0.3631 | g_loss: 4.4139
Epoch [    2/   10] | d_loss: 0.2347 | g_loss: 5.4626
Epoch [    2/   10] | d_loss: 0.4665 | g_loss: 5.5219
Epoch [    2/   10] | d_loss: 0.2939 | g_loss: 3.5983
Epoch [    2/   10] | d_loss: 0.2237 | g_loss: 5.6837
Epoch [    2/   10] | d_loss: 0.1753 | g_loss: 4.6437
Epoch [    2/   10] | d_loss: 0.1679 | g_loss: 3.3068
Epoch [    2/   10] | d_loss: 0.6438 | g_loss: 3.1037
Epoch [    2/   10] | d_loss: 0.1832 | g_loss: 5.8347
Epoch [    2/   10] | d_loss: 0.3083 | g_loss: 2.4997
Epoch [    2/   10] | d_loss: 0.1968 | g_loss: 3.3463
Epoch [    2/   10] | d_loss: 0.2497 | g_loss: 5.1440
Epoch [    2/   10] | d_loss: 0.1964 | g_loss: 4.1183
Epoch [    2/   10] | d_loss: 0.2497 | g_loss: 4.5229
Epoch [    2/   10] | d_loss: 0.5016 | g_loss: 5.6281
Epoch [    2/   10] | d_loss: 0.4161 | g_loss: 5.2014
Epoch [    2/   10] | d_loss: 0.1082 | g_loss: 5.9230
Epoch [    2/   10] | d_loss: 0.2682 | g_loss: 3.5605
Epoch [    2/   10] | d_loss: 0.4226 | g_loss: 6.3418
Epoch [    2/   10] | d_loss: 0.0063 | g_loss: 7.0223
Epoch [    2/   10] | d_loss: 0.1998 | g_loss: 2.9654
Epoch [    2/   10] | d_loss: 0.1866 | g_loss: 7.2672
Epoch [    2/   10] | d_loss: 0.1974 | g_loss: 4.5907
Epoch [    2/   10] | d_loss: 0.3515 | g_loss: 4.8020
Epoch [    2/   10] | d_loss: 0.0862 | g_loss: 5.6519
Epoch [    2/   10] | d_loss: 0.7186 | g_loss: 1.6936
Epoch [    2/   10] | d_loss: 0.2804 | g_loss: 3.0086
Epoch [    2/   10] | d_loss: 0.8561 | g_loss: 2.7902
Epoch [    2/   10] | d_loss: 0.4826 | g_loss: 5.2951
Epoch [    2/   10] | d_loss: 0.1849 | g_loss: 4.8776
Epoch [    2/   10] | d_loss: 0.1861 | g_loss: 5.2870
Epoch [    2/   10] | d_loss: 0.1870 | g_loss: 4.5270
Epoch [    2/   10] | d_loss: 0.4977 | g_loss: 7.2100
Epoch [    2/   10] | d_loss: 1.3070 | g_loss: 3.8317
Epoch [    2/   10] | d_loss: 0.4469 | g_loss: 1.9545
Epoch [    2/   10] | d_loss: 0.8279 | g_loss: 1.5046
Epoch [    2/   10] | d_loss: 0.4763 | g_loss: 3.1440
Epoch [    2/   10] | d_loss: 0.1848 | g_loss: 5.9210
Epoch [    2/   10] | d_loss: 0.2376 | g_loss: 6.1054
Epoch [    2/   10] | d_loss: 0.9626 | g_loss: 2.7557
Epoch [    2/   10] | d_loss: 0.0700 | g_loss: 4.3756
Epoch [    2/   10] | d_loss: 0.0214 | g_loss: 4.8386
Epoch [    2/   10] | d_loss: 0.1261 | g_loss: 5.6933
Epoch [    2/   10] | d_loss: 2.0805 | g_loss: 4.7206
Epoch [    2/   10] | d_loss: 0.0419 | g_loss: 7.0091
Epoch [    2/   10] | d_loss: 0.1578 | g_loss: 5.9049
Epoch [    2/   10] | d_loss: 1.0964 | g_loss: 6.2078
Epoch [    2/   10] | d_loss: 1.3335 | g_loss: 3.2506
Epoch [    2/   10] | d_loss: 0.2940 | g_loss: 2.7822
Epoch [    2/   10] | d_loss: 0.3649 | g_loss: 2.0819
Epoch [    2/   10] | d_loss: 0.4200 | g_loss: 3.6504
Epoch [    2/   10] | d_loss: 0.0610 | g_loss: 5.1190
Epoch [    2/   10] | d_loss: 0.4289 | g_loss: 4.0544
Epoch [    2/   10] | d_loss: 0.6590 | g_loss: 3.1107
Epoch [    2/   10] | d_loss: 0.0940 | g_loss: 5.0114
Epoch [    2/   10] | d_loss: 0.4982 | g_loss: 1.5809
Epoch [    2/   10] | d_loss: 0.9229 | g_loss: 2.8034
Epoch [    2/   10] | d_loss: 0.1148 | g_loss: 5.7501
Epoch [    2/   10] | d_loss: 0.9571 | g_loss: 6.3230
Epoch [    2/   10] | d_loss: 0.0534 | g_loss: 5.7814
Epoch [    2/   10] | d_loss: 0.0802 | g_loss: 5.1241
Epoch [    2/   10] | d_loss: 0.6654 | g_loss: 2.7028
Epoch [    2/   10] | d_loss: 0.2389 | g_loss: 3.2261
Epoch [    2/   10] | d_loss: 0.0678 | g_loss: 1.8828
Epoch [    2/   10] | d_loss: 0.1144 | g_loss: 4.0934
Epoch [    2/   10] | d_loss: 0.1343 | g_loss: 3.1380
Epoch [    2/   10] | d_loss: 0.0642 | g_loss: 3.1734
Epoch [    2/   10] | d_loss: 0.2441 | g_loss: 6.4035
Epoch [    2/   10] | d_loss: 0.1800 | g_loss: 0.8455
Epoch [    2/   10] | d_loss: 0.4813 | g_loss: 2.8280
Epoch [    2/   10] | d_loss: 0.2372 | g_loss: 5.8582
Epoch [    2/   10] | d_loss: 0.6379 | g_loss: 2.9334
Epoch [    2/   10] | d_loss: 0.1571 | g_loss: 5.1774
Epoch [    2/   10] | d_loss: 0.0922 | g_loss: 3.4765
Epoch [    2/   10] | d_loss: 0.2026 | g_loss: 1.7852
Epoch [    2/   10] | d_loss: 0.0451 | g_loss: 5.2841
Epoch [    2/   10] | d_loss: 0.2238 | g_loss: 3.5579
Epoch [    2/   10] | d_loss: 0.1153 | g_loss: 6.8041
Epoch [    2/   10] | d_loss: 0.4087 | g_loss: 2.1241
Epoch [    2/   10] | d_loss: 0.7863 | g_loss: 5.0609
Epoch [    2/   10] | d_loss: 0.5779 | g_loss: 3.2061
Epoch [    2/   10] | d_loss: 0.3594 | g_loss: 4.6189
Epoch [    2/   10] | d_loss: 0.3185 | g_loss: 4.1418
Epoch [    2/   10] | d_loss: 0.2936 | g_loss: 2.8636
Epoch [    2/   10] | d_loss: 0.1876 | g_loss: 2.7241
Epoch [    2/   10] | d_loss: 0.2179 | g_loss: 5.0370
Epoch [    2/   10] | d_loss: 0.7829 | g_loss: 1.4983
Epoch [    2/   10] | d_loss: 0.2084 | g_loss: 0.2557
Epoch [    2/   10] | d_loss: 0.3403 | g_loss: 6.2224
Epoch [    2/   10] | d_loss: 0.0183 | g_loss: 3.8625
Epoch [    2/   10] | d_loss: 2.0000 | g_loss: 5.0150
Epoch [    2/   10] | d_loss: 0.2416 | g_loss: 5.6296
Epoch [    2/   10] | d_loss: 0.3280 | g_loss: 3.6796
Epoch [    2/   10] | d_loss: 0.7392 | g_loss: 1.8084
Epoch [    2/   10] | d_loss: 0.0502 | g_loss: 4.3625
Epoch [    2/   10] | d_loss: 0.1318 | g_loss: 10.0443
Epoch [    2/   10] | d_loss: 0.2095 | g_loss: 4.9465
Epoch [    2/   10] | d_loss: 0.1078 | g_loss: 3.4908
Epoch [    2/   10] | d_loss: 0.1178 | g_loss: 2.4864
Epoch [    2/   10] | d_loss: 0.6739 | g_loss: 2.3877
Epoch [    2/   10] | d_loss: 1.3405 | g_loss: 0.3277
Epoch [    2/   10] | d_loss: 0.0599 | g_loss: 4.9404
Epoch [    2/   10] | d_loss: 0.2875 | g_loss: 3.3113
Epoch [    2/   10] | d_loss: 0.5506 | g_loss: 1.2120
Epoch [    2/   10] | d_loss: 0.3786 | g_loss: 0.6528
Epoch [    2/   10] | d_loss: 0.2284 | g_loss: 3.4423
Epoch [    2/   10] | d_loss: 0.2511 | g_loss: 3.0726
Epoch [    2/   10] | d_loss: 0.5499 | g_loss: 3.8542
Epoch [    2/   10] | d_loss: 0.1051 | g_loss: 3.9711
Epoch [    2/   10] | d_loss: 0.1287 | g_loss: 5.3084
Epoch [    2/   10] | d_loss: 0.4049 | g_loss: 4.4216
Epoch [    2/   10] | d_loss: 0.7605 | g_loss: 1.8537
Epoch [    2/   10] | d_loss: 0.2867 | g_loss: 4.8505
Epoch [    2/   10] | d_loss: 0.0399 | g_loss: 2.3845
Epoch [    2/   10] | d_loss: 0.1557 | g_loss: 3.1595
Epoch [    2/   10] | d_loss: 0.3014 | g_loss: 2.1452
Epoch [    2/   10] | d_loss: 0.3286 | g_loss: 3.2155
Epoch [    2/   10] | d_loss: 0.4352 | g_loss: 4.6193
Epoch [    2/   10] | d_loss: 0.3092 | g_loss: 2.1758
Epoch [    2/   10] | d_loss: 0.3013 | g_loss: 2.9884
Epoch [    2/   10] | d_loss: 0.0718 | g_loss: 4.3302
Epoch [    2/   10] | d_loss: 0.1787 | g_loss: 4.0325
Epoch [    2/   10] | d_loss: 0.0596 | g_loss: 1.6709
Epoch [    2/   10] | d_loss: 0.2336 | g_loss: 4.4792
Epoch [    2/   10] | d_loss: 0.0702 | g_loss: 2.4916
Epoch [    2/   10] | d_loss: 0.2937 | g_loss: 3.8225
Epoch [    3/   10] | d_loss: 0.2199 | g_loss: 2.9258
Epoch [    3/   10] | d_loss: 0.1506 | g_loss: 4.1126
Epoch [    3/   10] | d_loss: 1.0625 | g_loss: 1.5129
Epoch [    3/   10] | d_loss: 0.4710 | g_loss: 3.3993
Epoch [    3/   10] | d_loss: 0.4532 | g_loss: 2.9902
Epoch [    3/   10] | d_loss: 0.1384 | g_loss: 2.8079
Epoch [    3/   10] | d_loss: 0.6568 | g_loss: 4.4424
Epoch [    3/   10] | d_loss: 0.2210 | g_loss: 4.7413
Epoch [    3/   10] | d_loss: 0.0421 | g_loss: 1.9983
Epoch [    3/   10] | d_loss: 0.2462 | g_loss: 3.2651
Epoch [    3/   10] | d_loss: 0.0917 | g_loss: 4.0004
Epoch [    3/   10] | d_loss: 0.1812 | g_loss: 3.8898
Epoch [    3/   10] | d_loss: 0.1511 | g_loss: 3.7244
Epoch [    3/   10] | d_loss: 0.1804 | g_loss: 4.2387
Epoch [    3/   10] | d_loss: 0.4746 | g_loss: 1.3865
Epoch [    3/   10] | d_loss: 0.1729 | g_loss: 3.8289
Epoch [    3/   10] | d_loss: 0.4361 | g_loss: 4.9219
Epoch [    3/   10] | d_loss: 0.5194 | g_loss: 3.7778
Epoch [    3/   10] | d_loss: 0.0620 | g_loss: 5.8259
Epoch [    3/   10] | d_loss: 0.3692 | g_loss: 3.0924
Epoch [    3/   10] | d_loss: 0.5095 | g_loss: 6.1030
Epoch [    3/   10] | d_loss: 0.1510 | g_loss: 3.1804
Epoch [    3/   10] | d_loss: 0.4919 | g_loss: 3.1111
Epoch [    3/   10] | d_loss: 0.4353 | g_loss: 4.6197
Epoch [    3/   10] | d_loss: 0.1907 | g_loss: 4.9387
Epoch [    3/   10] | d_loss: 0.1241 | g_loss: 3.7829
Epoch [    3/   10] | d_loss: 0.2396 | g_loss: 4.4582
Epoch [    3/   10] | d_loss: 0.3135 | g_loss: 4.1047
Epoch [    3/   10] | d_loss: 1.9251 | g_loss: 2.9336
Epoch [    3/   10] | d_loss: 0.2517 | g_loss: 3.7119
Epoch [    3/   10] | d_loss: 0.5041 | g_loss: 4.1058
Epoch [    3/   10] | d_loss: 0.0624 | g_loss: 3.8203
Epoch [    3/   10] | d_loss: 0.0868 | g_loss: 2.6004
Epoch [    3/   10] | d_loss: 0.4406 | g_loss: 4.8636
Epoch [    3/   10] | d_loss: 0.0923 | g_loss: 3.7297
Epoch [    3/   10] | d_loss: 0.1091 | g_loss: 2.2106
Epoch [    3/   10] | d_loss: 0.0415 | g_loss: 4.0525
Epoch [    3/   10] | d_loss: 0.1758 | g_loss: 2.6423
Epoch [    3/   10] | d_loss: 0.3350 | g_loss: 2.5157
Epoch [    3/   10] | d_loss: 0.1055 | g_loss: 4.1598
Epoch [    3/   10] | d_loss: 0.4103 | g_loss: 2.1698
Epoch [    3/   10] | d_loss: 0.1914 | g_loss: 5.2261
Epoch [    3/   10] | d_loss: 0.7287 | g_loss: 3.0684
Epoch [    3/   10] | d_loss: 0.2370 | g_loss: 3.2712
Epoch [    3/   10] | d_loss: 0.2194 | g_loss: 5.1428
Epoch [    3/   10] | d_loss: 0.1780 | g_loss: 4.3766
Epoch [    3/   10] | d_loss: 0.1618 | g_loss: 2.8060
Epoch [    3/   10] | d_loss: 0.2621 | g_loss: 1.9483
Epoch [    3/   10] | d_loss: 0.0569 | g_loss: 5.3546
Epoch [    3/   10] | d_loss: 0.1321 | g_loss: 4.1521
Epoch [    3/   10] | d_loss: 0.2791 | g_loss: 3.7103
Epoch [    3/   10] | d_loss: 0.1659 | g_loss: 3.4748
Epoch [    3/   10] | d_loss: 0.2331 | g_loss: 3.4091
Epoch [    3/   10] | d_loss: 0.9236 | g_loss: 4.0493
Epoch [    3/   10] | d_loss: 0.8660 | g_loss: 2.5107
Epoch [    3/   10] | d_loss: 0.2029 | g_loss: 2.8962
Epoch [    3/   10] | d_loss: 0.3649 | g_loss: 3.4344
Epoch [    3/   10] | d_loss: 0.2410 | g_loss: 5.3699
Epoch [    3/   10] | d_loss: 0.1791 | g_loss: 2.7159
Epoch [    3/   10] | d_loss: 0.3142 | g_loss: 3.7032
Epoch [    3/   10] | d_loss: 0.2776 | g_loss: 2.9422
Epoch [    3/   10] | d_loss: 0.3638 | g_loss: 4.4104
Epoch [    3/   10] | d_loss: 0.1798 | g_loss: 2.6947
Epoch [    3/   10] | d_loss: 0.2442 | g_loss: 4.4548
Epoch [    3/   10] | d_loss: 0.3466 | g_loss: 7.0377
Epoch [    3/   10] | d_loss: 0.0581 | g_loss: 3.2362
Epoch [    3/   10] | d_loss: 0.2970 | g_loss: 3.0458
Epoch [    3/   10] | d_loss: 0.1832 | g_loss: 1.7398
Epoch [    3/   10] | d_loss: 0.0653 | g_loss: 3.1470
Epoch [    3/   10] | d_loss: 0.4225 | g_loss: 3.0466
Epoch [    3/   10] | d_loss: 0.1011 | g_loss: 5.6835
Epoch [    3/   10] | d_loss: 0.1287 | g_loss: 5.4939
Epoch [    3/   10] | d_loss: 0.3078 | g_loss: 4.6357
Epoch [    3/   10] | d_loss: 0.0709 | g_loss: 3.2173
Epoch [    3/   10] | d_loss: 0.1292 | g_loss: 3.4273
Epoch [    3/   10] | d_loss: 0.1642 | g_loss: 3.9554
Epoch [    3/   10] | d_loss: 0.0601 | g_loss: 5.7750
Epoch [    3/   10] | d_loss: 0.1028 | g_loss: 5.6704
Epoch [    3/   10] | d_loss: 0.0784 | g_loss: 3.8628
Epoch [    3/   10] | d_loss: 0.3496 | g_loss: 1.8017
Epoch [    3/   10] | d_loss: 0.2284 | g_loss: 5.6446
Epoch [    3/   10] | d_loss: 0.1010 | g_loss: 2.0211
Epoch [    3/   10] | d_loss: 0.1760 | g_loss: 4.0356
Epoch [    3/   10] | d_loss: 0.1804 | g_loss: 2.4773
Epoch [    3/   10] | d_loss: 0.0780 | g_loss: 4.3742
Epoch [    3/   10] | d_loss: 0.1208 | g_loss: 4.1628
Epoch [    3/   10] | d_loss: 0.4025 | g_loss: 3.8751
Epoch [    3/   10] | d_loss: 0.3998 | g_loss: 5.4560
Epoch [    3/   10] | d_loss: 0.1260 | g_loss: 6.4507
Epoch [    3/   10] | d_loss: 0.0625 | g_loss: 4.1512
Epoch [    3/   10] | d_loss: 0.0481 | g_loss: 4.9909
Epoch [    3/   10] | d_loss: 0.0783 | g_loss: 5.1531
Epoch [    3/   10] | d_loss: 0.2354 | g_loss: 5.9851
Epoch [    3/   10] | d_loss: 0.0207 | g_loss: 4.3313
Epoch [    3/   10] | d_loss: 0.1145 | g_loss: 4.3619
Epoch [    3/   10] | d_loss: 0.1960 | g_loss: 4.1260
Epoch [    3/   10] | d_loss: 0.0802 | g_loss: 2.9538
Epoch [    3/   10] | d_loss: 0.1071 | g_loss: 5.5024
Epoch [    3/   10] | d_loss: 0.1706 | g_loss: 4.4858
Epoch [    3/   10] | d_loss: 0.1441 | g_loss: 5.1059
Epoch [    3/   10] | d_loss: 0.1908 | g_loss: 5.0755
Epoch [    3/   10] | d_loss: 0.0521 | g_loss: 4.7709
Epoch [    3/   10] | d_loss: 0.0960 | g_loss: 2.4906
Epoch [    3/   10] | d_loss: 0.1763 | g_loss: 3.1886
Epoch [    3/   10] | d_loss: 0.3287 | g_loss: 2.3894
Epoch [    3/   10] | d_loss: 0.4585 | g_loss: 4.0137
Epoch [    3/   10] | d_loss: 0.3306 | g_loss: 3.0020
Epoch [    3/   10] | d_loss: 0.0673 | g_loss: 6.8703
Epoch [    3/   10] | d_loss: 0.1181 | g_loss: 4.0016
Epoch [    3/   10] | d_loss: 0.6394 | g_loss: 3.2278
Epoch [    3/   10] | d_loss: 0.0823 | g_loss: 5.2804
Epoch [    3/   10] | d_loss: 0.1481 | g_loss: 2.7297
Epoch [    3/   10] | d_loss: 0.2790 | g_loss: 6.1307
Epoch [    3/   10] | d_loss: 0.3371 | g_loss: 3.5075
Epoch [    3/   10] | d_loss: 0.0266 | g_loss: 2.2404
Epoch [    3/   10] | d_loss: 0.4937 | g_loss: 2.6322
Epoch [    3/   10] | d_loss: 0.1896 | g_loss: 1.7653
Epoch [    3/   10] | d_loss: 0.0947 | g_loss: 1.0773
Epoch [    3/   10] | d_loss: 0.7381 | g_loss: 2.4527
Epoch [    3/   10] | d_loss: 0.0406 | g_loss: 6.3258
Epoch [    3/   10] | d_loss: 0.1086 | g_loss: 4.7672
Epoch [    3/   10] | d_loss: 0.0830 | g_loss: 4.8638
Epoch [    3/   10] | d_loss: 0.1092 | g_loss: 5.2602
Epoch [    3/   10] | d_loss: 0.1120 | g_loss: 3.9764
Epoch [    3/   10] | d_loss: 0.2695 | g_loss: 3.4607
Epoch [    3/   10] | d_loss: 0.0757 | g_loss: 4.3291
Epoch [    3/   10] | d_loss: 0.0287 | g_loss: 3.4277
Epoch [    4/   10] | d_loss: 0.2125 | g_loss: 2.4440
Epoch [    4/   10] | d_loss: 0.1450 | g_loss: 8.6633
Epoch [    4/   10] | d_loss: 0.0914 | g_loss: 5.0463
Epoch [    4/   10] | d_loss: 0.1461 | g_loss: 3.0189
Epoch [    4/   10] | d_loss: 0.0608 | g_loss: 4.0131
Epoch [    4/   10] | d_loss: 0.0241 | g_loss: 6.2226
Epoch [    4/   10] | d_loss: 0.1233 | g_loss: 5.0932
Epoch [    4/   10] | d_loss: 0.1913 | g_loss: 4.1441
Epoch [    4/   10] | d_loss: 0.0122 | g_loss: 4.7539
Epoch [    4/   10] | d_loss: 0.1296 | g_loss: 4.9629
Epoch [    4/   10] | d_loss: 0.0543 | g_loss: 5.8628
Epoch [    4/   10] | d_loss: 0.1735 | g_loss: 4.0175
Epoch [    4/   10] | d_loss: 0.1398 | g_loss: 2.3968
Epoch [    4/   10] | d_loss: 0.1858 | g_loss: 5.6189
Epoch [    4/   10] | d_loss: 0.2422 | g_loss: 2.9578
Epoch [    4/   10] | d_loss: 0.1998 | g_loss: 2.2886
Epoch [    4/   10] | d_loss: 0.0369 | g_loss: 3.9959
Epoch [    4/   10] | d_loss: 0.1382 | g_loss: 4.1468
Epoch [    4/   10] | d_loss: 0.1230 | g_loss: 3.8452
Epoch [    4/   10] | d_loss: 0.2036 | g_loss: 4.3849
Epoch [    4/   10] | d_loss: 0.3088 | g_loss: 5.0176
Epoch [    4/   10] | d_loss: 0.2094 | g_loss: 6.7736
Epoch [    4/   10] | d_loss: 0.0984 | g_loss: 4.7217
Epoch [    4/   10] | d_loss: 0.1609 | g_loss: 6.3982
Epoch [    4/   10] | d_loss: 0.3306 | g_loss: 5.4531
Epoch [    4/   10] | d_loss: 0.2462 | g_loss: 3.2141
Epoch [    4/   10] | d_loss: 0.1678 | g_loss: 4.1087
Epoch [    4/   10] | d_loss: 0.0678 | g_loss: 4.6951
Epoch [    4/   10] | d_loss: 0.0291 | g_loss: 5.4545
Epoch [    4/   10] | d_loss: 0.1617 | g_loss: 3.0568
Epoch [    4/   10] | d_loss: 0.9932 | g_loss: 6.0311
Epoch [    4/   10] | d_loss: 0.0517 | g_loss: 2.6535
Epoch [    4/   10] | d_loss: 0.1546 | g_loss: 5.0831
Epoch [    4/   10] | d_loss: 0.1204 | g_loss: 2.5063
Epoch [    4/   10] | d_loss: 0.4148 | g_loss: 2.0356
Epoch [    4/   10] | d_loss: 0.2129 | g_loss: 3.9287
Epoch [    4/   10] | d_loss: 0.0320 | g_loss: 4.3227
Epoch [    4/   10] | d_loss: 0.0475 | g_loss: 3.5744
Epoch [    4/   10] | d_loss: 0.0387 | g_loss: 3.7123
Epoch [    4/   10] | d_loss: 0.0433 | g_loss: 6.7806
Epoch [    4/   10] | d_loss: 0.0197 | g_loss: 3.9351
Epoch [    4/   10] | d_loss: 0.0455 | g_loss: 4.3975
Epoch [    4/   10] | d_loss: 0.2107 | g_loss: 4.1423
Epoch [    4/   10] | d_loss: 0.2275 | g_loss: 6.1535
Epoch [    4/   10] | d_loss: 0.1032 | g_loss: 4.9790
Epoch [    4/   10] | d_loss: 0.0217 | g_loss: 5.0036
Epoch [    4/   10] | d_loss: 0.1393 | g_loss: 4.4989
Epoch [    4/   10] | d_loss: 0.0268 | g_loss: 5.1859
Epoch [    4/   10] | d_loss: 0.0418 | g_loss: 4.9466
Epoch [    4/   10] | d_loss: 0.2015 | g_loss: 4.8974
Epoch [    4/   10] | d_loss: 0.1004 | g_loss: 4.4017
Epoch [    4/   10] | d_loss: 0.0404 | g_loss: 2.2861
Epoch [    4/   10] | d_loss: 0.0340 | g_loss: 5.5060
Epoch [    4/   10] | d_loss: 0.0720 | g_loss: 4.8337
Epoch [    4/   10] | d_loss: 0.1311 | g_loss: 5.3325
Epoch [    4/   10] | d_loss: 0.0756 | g_loss: 5.7049
Epoch [    4/   10] | d_loss: 0.0410 | g_loss: 7.4611
Epoch [    4/   10] | d_loss: 0.0960 | g_loss: 6.1853
Epoch [    4/   10] | d_loss: 0.0970 | g_loss: 4.2016
Epoch [    4/   10] | d_loss: 0.0176 | g_loss: 4.4856
Epoch [    4/   10] | d_loss: 0.0616 | g_loss: 5.1462
Epoch [    4/   10] | d_loss: 0.1297 | g_loss: 3.8863
Epoch [    4/   10] | d_loss: 0.0338 | g_loss: 4.0138
Epoch [    4/   10] | d_loss: 0.0276 | g_loss: 5.0711
Epoch [    4/   10] | d_loss: 0.1078 | g_loss: 6.3890
Epoch [    4/   10] | d_loss: 0.3213 | g_loss: 4.2944
Epoch [    4/   10] | d_loss: 0.0793 | g_loss: 5.5344
Epoch [    4/   10] | d_loss: 0.0400 | g_loss: 3.9249
Epoch [    4/   10] | d_loss: 0.1020 | g_loss: 4.9613
Epoch [    4/   10] | d_loss: 0.0080 | g_loss: 6.4153
Epoch [    4/   10] | d_loss: 0.1660 | g_loss: 3.2735
Epoch [    4/   10] | d_loss: 0.2302 | g_loss: 2.9156
Epoch [    4/   10] | d_loss: 0.0392 | g_loss: 3.5625
Epoch [    4/   10] | d_loss: 0.4779 | g_loss: 3.3062
Epoch [    4/   10] | d_loss: 0.1969 | g_loss: 4.1006
Epoch [    4/   10] | d_loss: 0.0460 | g_loss: 5.0866
Epoch [    4/   10] | d_loss: 0.3755 | g_loss: 4.5103
Epoch [    4/   10] | d_loss: 0.0502 | g_loss: 3.9868
Epoch [    4/   10] | d_loss: 0.0698 | g_loss: 5.9437
Epoch [    4/   10] | d_loss: 0.0526 | g_loss: 4.6566
Epoch [    4/   10] | d_loss: 0.0354 | g_loss: 4.6226
Epoch [    4/   10] | d_loss: 0.0772 | g_loss: 4.6577
Epoch [    4/   10] | d_loss: 0.1223 | g_loss: 3.7151
Epoch [    4/   10] | d_loss: 0.1004 | g_loss: 4.1196
Epoch [    4/   10] | d_loss: 0.1989 | g_loss: 3.1283
Epoch [    4/   10] | d_loss: 0.3553 | g_loss: 5.4858
Epoch [    4/   10] | d_loss: 0.0943 | g_loss: 3.3820
Epoch [    4/   10] | d_loss: 0.0598 | g_loss: 5.1020
Epoch [    4/   10] | d_loss: 0.0377 | g_loss: 5.2004
Epoch [    4/   10] | d_loss: 0.1334 | g_loss: 5.7766
Epoch [    4/   10] | d_loss: 0.0889 | g_loss: 4.2416
Epoch [    4/   10] | d_loss: 0.0781 | g_loss: 3.0428
Epoch [    4/   10] | d_loss: 0.0512 | g_loss: 5.4746
Epoch [    4/   10] | d_loss: 0.0350 | g_loss: 3.8882
Epoch [    4/   10] | d_loss: 0.0233 | g_loss: 6.6030
Epoch [    4/   10] | d_loss: 0.1702 | g_loss: 4.3458
Epoch [    4/   10] | d_loss: 0.0361 | g_loss: 4.8055
Epoch [    4/   10] | d_loss: 0.0451 | g_loss: 4.8708
Epoch [    4/   10] | d_loss: 0.1352 | g_loss: 4.3974
Epoch [    4/   10] | d_loss: 0.1443 | g_loss: 5.7039
Epoch [    4/   10] | d_loss: 0.0213 | g_loss: 6.4604
Epoch [    4/   10] | d_loss: 0.1834 | g_loss: 3.1091
Epoch [    4/   10] | d_loss: 0.0981 | g_loss: 5.6786
Epoch [    4/   10] | d_loss: 0.0221 | g_loss: 5.3518
Epoch [    4/   10] | d_loss: 0.0869 | g_loss: 6.0882
Epoch [    4/   10] | d_loss: 0.2150 | g_loss: 2.8760
Epoch [    4/   10] | d_loss: 0.2867 | g_loss: 2.7408
Epoch [    4/   10] | d_loss: 0.3465 | g_loss: 4.3229
Epoch [    4/   10] | d_loss: 0.1350 | g_loss: 4.1848
Epoch [    4/   10] | d_loss: 0.1912 | g_loss: 4.4895
Epoch [    4/   10] | d_loss: 0.2162 | g_loss: 5.8869
Epoch [    4/   10] | d_loss: 0.0631 | g_loss: 2.9020
Epoch [    4/   10] | d_loss: 0.0577 | g_loss: 4.6169
Epoch [    4/   10] | d_loss: 0.2846 | g_loss: 3.8439
Epoch [    4/   10] | d_loss: 0.0464 | g_loss: 2.9028
Epoch [    4/   10] | d_loss: 0.1554 | g_loss: 4.4725
Epoch [    4/   10] | d_loss: 0.0272 | g_loss: 3.2833
Epoch [    4/   10] | d_loss: 0.0441 | g_loss: 3.4987
Epoch [    4/   10] | d_loss: 0.0455 | g_loss: 6.2005
Epoch [    4/   10] | d_loss: 0.0159 | g_loss: 6.2553
Epoch [    4/   10] | d_loss: 0.1105 | g_loss: 2.0225
Epoch [    4/   10] | d_loss: 0.1142 | g_loss: 7.8510
Epoch [    4/   10] | d_loss: 0.7074 | g_loss: 4.7792
Epoch [    4/   10] | d_loss: 0.2881 | g_loss: 6.6648
Epoch [    4/   10] | d_loss: 0.0368 | g_loss: 7.4783
Epoch [    4/   10] | d_loss: 0.0542 | g_loss: 5.3540
Epoch [    4/   10] | d_loss: 0.0203 | g_loss: 4.5352
Epoch [    5/   10] | d_loss: 3.5773 | g_loss: 6.3793
Epoch [    5/   10] | d_loss: 0.0373 | g_loss: 3.4269
Epoch [    5/   10] | d_loss: 0.0484 | g_loss: 3.3581
Epoch [    5/   10] | d_loss: 0.1233 | g_loss: 5.1916
Epoch [    5/   10] | d_loss: 0.0030 | g_loss: 6.2858
Epoch [    5/   10] | d_loss: 0.0360 | g_loss: 5.7591
Epoch [    5/   10] | d_loss: 0.1906 | g_loss: 5.1732
Epoch [    5/   10] | d_loss: 0.1306 | g_loss: 5.3450
Epoch [    5/   10] | d_loss: 0.2135 | g_loss: 3.6209
Epoch [    5/   10] | d_loss: 0.1424 | g_loss: 3.5814
Epoch [    5/   10] | d_loss: 0.1434 | g_loss: 3.7481
Epoch [    5/   10] | d_loss: 0.0122 | g_loss: 7.1511
Epoch [    5/   10] | d_loss: 0.4122 | g_loss: 3.0166
Epoch [    5/   10] | d_loss: 0.2802 | g_loss: 7.5718
Epoch [    5/   10] | d_loss: 0.0315 | g_loss: 5.6030
Epoch [    5/   10] | d_loss: 0.0227 | g_loss: 5.8589
Epoch [    5/   10] | d_loss: 0.1668 | g_loss: 4.5545
Epoch [    5/   10] | d_loss: 0.0172 | g_loss: 3.9632
Epoch [    5/   10] | d_loss: 0.0158 | g_loss: 4.5203
Epoch [    5/   10] | d_loss: 0.1170 | g_loss: 3.5879
Epoch [    5/   10] | d_loss: 0.0391 | g_loss: 5.0422
Epoch [    5/   10] | d_loss: 0.0711 | g_loss: 6.1403
Epoch [    5/   10] | d_loss: 0.2037 | g_loss: 5.5031
Epoch [    5/   10] | d_loss: 0.1110 | g_loss: 4.8571
Epoch [    5/   10] | d_loss: 0.0694 | g_loss: 7.3655
Epoch [    5/   10] | d_loss: 0.0815 | g_loss: 4.7039
Epoch [    5/   10] | d_loss: 0.0206 | g_loss: 5.6724
Epoch [    5/   10] | d_loss: 0.0351 | g_loss: 4.9640
Epoch [    5/   10] | d_loss: 0.0387 | g_loss: 4.7013
Epoch [    5/   10] | d_loss: 0.0273 | g_loss: 7.9245
Epoch [    5/   10] | d_loss: 0.0533 | g_loss: 4.6089
Epoch [    5/   10] | d_loss: 0.0324 | g_loss: 4.6704
Epoch [    5/   10] | d_loss: 0.0651 | g_loss: 4.2719
Epoch [    5/   10] | d_loss: 0.0128 | g_loss: 5.8807
Epoch [    5/   10] | d_loss: 0.0682 | g_loss: 6.9982
Epoch [    5/   10] | d_loss: 0.0220 | g_loss: 5.8981
Epoch [    5/   10] | d_loss: 0.0882 | g_loss: 5.0777
Epoch [    5/   10] | d_loss: 0.0537 | g_loss: 5.9011
Epoch [    5/   10] | d_loss: 0.1166 | g_loss: 6.8232
Epoch [    5/   10] | d_loss: 0.3380 | g_loss: 1.9488
Epoch [    5/   10] | d_loss: 0.3578 | g_loss: 4.6781
Epoch [    5/   10] | d_loss: 0.0428 | g_loss: 4.7134
Epoch [    5/   10] | d_loss: 0.1081 | g_loss: 4.7930
Epoch [    5/   10] | d_loss: 0.1467 | g_loss: 7.1403
Epoch [    5/   10] | d_loss: 0.0250 | g_loss: 5.2001
Epoch [    5/   10] | d_loss: 0.0118 | g_loss: 8.1984
Epoch [    5/   10] | d_loss: 0.1080 | g_loss: 4.9888
Epoch [    5/   10] | d_loss: 0.1591 | g_loss: 4.9015
Epoch [    5/   10] | d_loss: 0.0454 | g_loss: 7.6027
Epoch [    5/   10] | d_loss: 0.0756 | g_loss: 6.3448
Epoch [    5/   10] | d_loss: 0.0568 | g_loss: 4.8922
Epoch [    5/   10] | d_loss: 0.1075 | g_loss: 5.7676
Epoch [    5/   10] | d_loss: 0.0311 | g_loss: 7.6424
Epoch [    5/   10] | d_loss: 0.0717 | g_loss: 5.5440
Epoch [    5/   10] | d_loss: 0.1069 | g_loss: 6.3917
Epoch [    5/   10] | d_loss: 0.0079 | g_loss: 10.2744
Epoch [    5/   10] | d_loss: 0.1360 | g_loss: 6.0532
Epoch [    5/   10] | d_loss: 0.0148 | g_loss: 4.2096
Epoch [    5/   10] | d_loss: 0.0399 | g_loss: 3.8500
Epoch [    5/   10] | d_loss: 0.0289 | g_loss: 8.8097
Epoch [    5/   10] | d_loss: 0.0431 | g_loss: 4.5276
Epoch [    5/   10] | d_loss: 0.2041 | g_loss: 3.8989
Epoch [    5/   10] | d_loss: 0.0012 | g_loss: 9.2025
Epoch [    5/   10] | d_loss: 0.0892 | g_loss: 4.1728
Epoch [    5/   10] | d_loss: 0.0086 | g_loss: 5.2614
Epoch [    5/   10] | d_loss: 0.2995 | g_loss: 3.4389
Epoch [    5/   10] | d_loss: 0.1028 | g_loss: 5.5917
Epoch [    5/   10] | d_loss: 0.0209 | g_loss: 7.7883
Epoch [    5/   10] | d_loss: 0.0565 | g_loss: 5.5697
Epoch [    5/   10] | d_loss: 0.0209 | g_loss: 4.9086
Epoch [    5/   10] | d_loss: 0.4114 | g_loss: 6.9367
Epoch [    5/   10] | d_loss: 0.0196 | g_loss: 6.4139
Epoch [    5/   10] | d_loss: 0.1965 | g_loss: 3.3065
Epoch [    5/   10] | d_loss: 0.0210 | g_loss: 4.2352
Epoch [    5/   10] | d_loss: 0.1501 | g_loss: 4.3740
Epoch [    5/   10] | d_loss: 0.0051 | g_loss: 8.1234
Epoch [    5/   10] | d_loss: 0.2411 | g_loss: 3.8058
Epoch [    5/   10] | d_loss: 0.0905 | g_loss: 5.4416
Epoch [    5/   10] | d_loss: 0.0492 | g_loss: 5.2978
Epoch [    5/   10] | d_loss: 0.0152 | g_loss: 6.7604
Epoch [    5/   10] | d_loss: 0.0199 | g_loss: 7.0795
Epoch [    5/   10] | d_loss: 0.0773 | g_loss: 5.8535
Epoch [    5/   10] | d_loss: 0.0432 | g_loss: 7.1393
Epoch [    5/   10] | d_loss: 0.0281 | g_loss: 4.3629
Epoch [    5/   10] | d_loss: 0.0417 | g_loss: 5.5792
Epoch [    5/   10] | d_loss: 0.1368 | g_loss: 4.7739
Epoch [    5/   10] | d_loss: 0.0912 | g_loss: 2.5119
Epoch [    5/   10] | d_loss: 0.0469 | g_loss: 5.2958
Epoch [    5/   10] | d_loss: 0.1987 | g_loss: 4.5468
Epoch [    5/   10] | d_loss: 0.0807 | g_loss: 5.9589
Epoch [    5/   10] | d_loss: 0.0547 | g_loss: 7.2269
Epoch [    5/   10] | d_loss: 0.0961 | g_loss: 6.1417
Epoch [    5/   10] | d_loss: 0.2720 | g_loss: 3.6475
Epoch [    5/   10] | d_loss: 0.0171 | g_loss: 4.8526
Epoch [    5/   10] | d_loss: 0.0092 | g_loss: 5.2876
Epoch [    5/   10] | d_loss: 0.0279 | g_loss: 5.8767
Epoch [    5/   10] | d_loss: 0.2974 | g_loss: 3.7230
Epoch [    5/   10] | d_loss: 0.0579 | g_loss: 4.7992
Epoch [    5/   10] | d_loss: 0.1141 | g_loss: 5.9357
Epoch [    5/   10] | d_loss: 0.0882 | g_loss: 5.2159
Epoch [    5/   10] | d_loss: 0.0504 | g_loss: 4.2164
Epoch [    5/   10] | d_loss: 0.0698 | g_loss: 5.6311
Epoch [    5/   10] | d_loss: 0.1587 | g_loss: 4.2931
Epoch [    5/   10] | d_loss: 0.0623 | g_loss: 4.9180
Epoch [    5/   10] | d_loss: 0.0384 | g_loss: 4.0210
Epoch [    5/   10] | d_loss: 0.0077 | g_loss: 4.5526
Epoch [    5/   10] | d_loss: 0.0324 | g_loss: 7.2443
Epoch [    5/   10] | d_loss: 0.0428 | g_loss: 4.7300
Epoch [    5/   10] | d_loss: 0.0378 | g_loss: 0.6647
Epoch [    5/   10] | d_loss: 0.0328 | g_loss: 7.3405
Epoch [    5/   10] | d_loss: 0.2703 | g_loss: 1.0690
Epoch [    5/   10] | d_loss: 0.0523 | g_loss: 5.6021
Epoch [    5/   10] | d_loss: 0.0337 | g_loss: 5.6146
Epoch [    5/   10] | d_loss: 0.0048 | g_loss: 6.9912
Epoch [    5/   10] | d_loss: 0.0126 | g_loss: 5.5358
Epoch [    5/   10] | d_loss: 0.0202 | g_loss: 6.1616
Epoch [    5/   10] | d_loss: 0.0917 | g_loss: 5.5654
Epoch [    5/   10] | d_loss: 0.0363 | g_loss: 3.3519
Epoch [    5/   10] | d_loss: 0.2688 | g_loss: 2.1705
Epoch [    5/   10] | d_loss: 0.0339 | g_loss: 8.5611
Epoch [    5/   10] | d_loss: 0.0382 | g_loss: 4.8807
Epoch [    5/   10] | d_loss: 0.2368 | g_loss: 4.1897
Epoch [    5/   10] | d_loss: 0.0524 | g_loss: 6.2745
Epoch [    5/   10] | d_loss: 0.0372 | g_loss: 5.7752
Epoch [    5/   10] | d_loss: 0.0181 | g_loss: 6.2983
Epoch [    5/   10] | d_loss: 0.0449 | g_loss: 3.9979
Epoch [    5/   10] | d_loss: 0.0247 | g_loss: 5.0042
Epoch [    6/   10] | d_loss: 1.3504 | g_loss: 5.2092
Epoch [    6/   10] | d_loss: 0.0967 | g_loss: 4.4580
Epoch [    6/   10] | d_loss: 0.0704 | g_loss: 5.1942
Epoch [    6/   10] | d_loss: 0.0494 | g_loss: 6.5343
Epoch [    6/   10] | d_loss: 1.0549 | g_loss: 8.8985
Epoch [    6/   10] | d_loss: 0.0281 | g_loss: 5.3083
Epoch [    6/   10] | d_loss: 0.0810 | g_loss: 6.5831
Epoch [    6/   10] | d_loss: 0.1843 | g_loss: 7.2953
Epoch [    6/   10] | d_loss: 0.0053 | g_loss: 6.6968
Epoch [    6/   10] | d_loss: 0.1317 | g_loss: 5.0770
Epoch [    6/   10] | d_loss: 0.0881 | g_loss: 5.0002
Epoch [    6/   10] | d_loss: 0.1729 | g_loss: 6.0481
Epoch [    6/   10] | d_loss: 0.0971 | g_loss: 5.0767
Epoch [    6/   10] | d_loss: 0.0481 | g_loss: 5.6382
Epoch [    6/   10] | d_loss: 0.1012 | g_loss: 4.5548
Epoch [    6/   10] | d_loss: 0.1605 | g_loss: 7.7718
Epoch [    6/   10] | d_loss: 0.0686 | g_loss: 3.5174
Epoch [    6/   10] | d_loss: 0.3231 | g_loss: 4.6609
Epoch [    6/   10] | d_loss: 0.0566 | g_loss: 3.4789
Epoch [    6/   10] | d_loss: 0.3012 | g_loss: 7.8247
Epoch [    6/   10] | d_loss: 0.0219 | g_loss: 4.6429
Epoch [    6/   10] | d_loss: 0.0405 | g_loss: 6.1652
Epoch [    6/   10] | d_loss: 0.1450 | g_loss: 5.7504
Epoch [    6/   10] | d_loss: 0.0103 | g_loss: 6.8343
Epoch [    6/   10] | d_loss: 0.0374 | g_loss: 5.6065
Epoch [    6/   10] | d_loss: 0.0284 | g_loss: 7.0564
Epoch [    6/   10] | d_loss: 0.0209 | g_loss: 5.8654
Epoch [    6/   10] | d_loss: 0.0352 | g_loss: 3.4703
Epoch [    6/   10] | d_loss: 0.0155 | g_loss: 6.1997
Epoch [    6/   10] | d_loss: 0.0398 | g_loss: 7.3673
Epoch [    6/   10] | d_loss: 0.1351 | g_loss: 4.8880
Epoch [    6/   10] | d_loss: 0.0273 | g_loss: 7.7482
Epoch [    6/   10] | d_loss: 0.0024 | g_loss: 5.0869
Epoch [    6/   10] | d_loss: 0.0347 | g_loss: 2.4526
Epoch [    6/   10] | d_loss: 0.0823 | g_loss: 5.3086
Epoch [    6/   10] | d_loss: 0.0340 | g_loss: 4.5016
Epoch [    6/   10] | d_loss: 0.3389 | g_loss: 3.5265
Epoch [    6/   10] | d_loss: 0.0956 | g_loss: 3.7621
Epoch [    6/   10] | d_loss: 0.0108 | g_loss: 4.3432
Epoch [    6/   10] | d_loss: 0.0383 | g_loss: 5.9432
Epoch [    6/   10] | d_loss: 0.0467 | g_loss: 4.5026
Epoch [    6/   10] | d_loss: 0.5834 | g_loss: 0.4232
Epoch [    6/   10] | d_loss: 0.2894 | g_loss: 5.3285
Epoch [    6/   10] | d_loss: 0.3473 | g_loss: 5.8217
Epoch [    6/   10] | d_loss: 0.0322 | g_loss: 6.9752
Epoch [    6/   10] | d_loss: 0.0318 | g_loss: 5.6927
Epoch [    6/   10] | d_loss: 0.0743 | g_loss: 4.8890
Epoch [    6/   10] | d_loss: 0.0233 | g_loss: 5.1661
Epoch [    6/   10] | d_loss: 0.1502 | g_loss: 6.6335
Epoch [    6/   10] | d_loss: 0.0234 | g_loss: 5.7970
Epoch [    6/   10] | d_loss: 0.0181 | g_loss: 3.6690
Epoch [    6/   10] | d_loss: 0.0709 | g_loss: 5.8074
Epoch [    6/   10] | d_loss: 0.1317 | g_loss: 7.0961
Epoch [    6/   10] | d_loss: 0.2871 | g_loss: 2.9257
Epoch [    6/   10] | d_loss: 0.0418 | g_loss: 2.6079
Epoch [    6/   10] | d_loss: 0.0343 | g_loss: 5.8508
Epoch [    6/   10] | d_loss: 0.1440 | g_loss: 5.0931
Epoch [    6/   10] | d_loss: 0.0900 | g_loss: 4.2810
Epoch [    6/   10] | d_loss: 0.0194 | g_loss: 6.0121
Epoch [    6/   10] | d_loss: 0.0358 | g_loss: 5.4566
Epoch [    6/   10] | d_loss: 0.2036 | g_loss: 4.1264
Epoch [    6/   10] | d_loss: 0.0944 | g_loss: 4.4900
Epoch [    6/   10] | d_loss: 0.0375 | g_loss: 4.8069
Epoch [    6/   10] | d_loss: 0.0886 | g_loss: 3.1432
Epoch [    6/   10] | d_loss: 2.0842 | g_loss: 0.2315
Epoch [    6/   10] | d_loss: 0.1520 | g_loss: 3.5520
Epoch [    6/   10] | d_loss: 0.0298 | g_loss: 4.4429
Epoch [    6/   10] | d_loss: 0.0683 | g_loss: 5.0840
Epoch [    6/   10] | d_loss: 0.1968 | g_loss: 7.5697
Epoch [    6/   10] | d_loss: 0.0250 | g_loss: 4.2766
Epoch [    6/   10] | d_loss: 0.0371 | g_loss: 6.0207
Epoch [    6/   10] | d_loss: 0.0026 | g_loss: 6.1153
Epoch [    6/   10] | d_loss: 0.0609 | g_loss: 4.8196
Epoch [    6/   10] | d_loss: 0.0363 | g_loss: 5.9188
Epoch [    6/   10] | d_loss: 0.0590 | g_loss: 2.1251
Epoch [    6/   10] | d_loss: 0.0486 | g_loss: 6.6291
Epoch [    6/   10] | d_loss: 0.0540 | g_loss: 4.7592
Epoch [    6/   10] | d_loss: 0.1936 | g_loss: 4.0321
Epoch [    6/   10] | d_loss: 0.0806 | g_loss: 6.6368
Epoch [    6/   10] | d_loss: 0.2938 | g_loss: 6.8464
Epoch [    6/   10] | d_loss: 0.0883 | g_loss: 5.1124
Epoch [    6/   10] | d_loss: 0.0432 | g_loss: 4.8599
Epoch [    6/   10] | d_loss: 0.0384 | g_loss: 6.7913
Epoch [    6/   10] | d_loss: 0.1183 | g_loss: 4.9564
Epoch [    6/   10] | d_loss: 0.1781 | g_loss: 3.7874
Epoch [    6/   10] | d_loss: 0.0589 | g_loss: 3.2318
Epoch [    6/   10] | d_loss: 0.1931 | g_loss: 6.1459
Epoch [    6/   10] | d_loss: 0.0313 | g_loss: 5.7105
Epoch [    6/   10] | d_loss: 0.0238 | g_loss: 5.1211
Epoch [    6/   10] | d_loss: 0.0753 | g_loss: 3.9393
Epoch [    6/   10] | d_loss: 0.0203 | g_loss: 6.8099
Epoch [    6/   10] | d_loss: 0.0092 | g_loss: 5.1037
Epoch [    6/   10] | d_loss: 0.0074 | g_loss: 6.4503
Epoch [    6/   10] | d_loss: 0.1031 | g_loss: 4.2924
Epoch [    6/   10] | d_loss: 0.0872 | g_loss: 4.9896
Epoch [    6/   10] | d_loss: 0.0437 | g_loss: 4.9815
Epoch [    6/   10] | d_loss: 0.0090 | g_loss: 8.7437
Epoch [    6/   10] | d_loss: 0.0305 | g_loss: 2.9188
Epoch [    6/   10] | d_loss: 0.0386 | g_loss: 4.5169
Epoch [    6/   10] | d_loss: 0.3468 | g_loss: 4.4704
Epoch [    6/   10] | d_loss: 0.0523 | g_loss: 5.4251
Epoch [    6/   10] | d_loss: 0.0736 | g_loss: 5.7388
Epoch [    6/   10] | d_loss: 0.0303 | g_loss: 5.3891
Epoch [    6/   10] | d_loss: 0.0085 | g_loss: 5.4954
Epoch [    6/   10] | d_loss: 0.0857 | g_loss: 6.0356
Epoch [    6/   10] | d_loss: 0.1557 | g_loss: 5.5752
Epoch [    6/   10] | d_loss: 0.0201 | g_loss: 4.7563
Epoch [    6/   10] | d_loss: 0.0768 | g_loss: 6.0789
Epoch [    6/   10] | d_loss: 0.0345 | g_loss: 5.2083
Epoch [    6/   10] | d_loss: 0.1100 | g_loss: 3.9668
Epoch [    6/   10] | d_loss: 0.1889 | g_loss: 5.2135
Epoch [    6/   10] | d_loss: 0.6633 | g_loss: 5.0188
Epoch [    6/   10] | d_loss: 0.0462 | g_loss: 4.2214
Epoch [    6/   10] | d_loss: 0.0120 | g_loss: 7.3276
Epoch [    6/   10] | d_loss: 0.0632 | g_loss: 4.0406
Epoch [    6/   10] | d_loss: 0.2337 | g_loss: 6.7495
Epoch [    6/   10] | d_loss: 0.0605 | g_loss: 5.0244
Epoch [    6/   10] | d_loss: 0.0483 | g_loss: 5.0503
Epoch [    6/   10] | d_loss: 0.1927 | g_loss: 5.9762
Epoch [    6/   10] | d_loss: 0.1580 | g_loss: 6.7357
Epoch [    6/   10] | d_loss: 0.1041 | g_loss: 4.7363
Epoch [    6/   10] | d_loss: 0.0180 | g_loss: 6.4309
Epoch [    6/   10] | d_loss: 0.1041 | g_loss: 3.7131
Epoch [    6/   10] | d_loss: 0.0753 | g_loss: 2.8192
Epoch [    6/   10] | d_loss: 0.0509 | g_loss: 7.6912
Epoch [    6/   10] | d_loss: 0.0558 | g_loss: 7.4569
Epoch [    6/   10] | d_loss: 0.5383 | g_loss: 3.5095
Epoch [    7/   10] | d_loss: 1.6473 | g_loss: 7.8214
Epoch [    7/   10] | d_loss: 0.1554 | g_loss: 6.1965
Epoch [    7/   10] | d_loss: 0.0254 | g_loss: 7.0720
Epoch [    7/   10] | d_loss: 0.1175 | g_loss: 4.8196
Epoch [    7/   10] | d_loss: 0.0256 | g_loss: 2.7915
Epoch [    7/   10] | d_loss: 0.0251 | g_loss: 8.5433
Epoch [    7/   10] | d_loss: 0.0442 | g_loss: 8.0478
Epoch [    7/   10] | d_loss: 0.0342 | g_loss: 4.9820
Epoch [    7/   10] | d_loss: 0.1316 | g_loss: 6.5560
Epoch [    7/   10] | d_loss: 0.0994 | g_loss: 5.6557
Epoch [    7/   10] | d_loss: 0.0529 | g_loss: 4.0527
Epoch [    7/   10] | d_loss: 0.0206 | g_loss: 7.1691
Epoch [    7/   10] | d_loss: 0.0230 | g_loss: 6.6652
Epoch [    7/   10] | d_loss: 0.0528 | g_loss: 5.2067
Epoch [    7/   10] | d_loss: 0.1712 | g_loss: 6.8116
Epoch [    7/   10] | d_loss: 0.0510 | g_loss: 3.3985
Epoch [    7/   10] | d_loss: 0.0042 | g_loss: 6.3464
Epoch [    7/   10] | d_loss: 0.0893 | g_loss: 4.3686
Epoch [    7/   10] | d_loss: 0.0290 | g_loss: 5.3994
Epoch [    7/   10] | d_loss: 0.0808 | g_loss: 6.5060
Epoch [    7/   10] | d_loss: 0.0609 | g_loss: 3.7493
Epoch [    7/   10] | d_loss: 0.1672 | g_loss: 6.9770
Epoch [    7/   10] | d_loss: 0.1799 | g_loss: 4.2732
Epoch [    7/   10] | d_loss: 0.0842 | g_loss: 4.9445
Epoch [    7/   10] | d_loss: 0.3635 | g_loss: 2.7466
Epoch [    7/   10] | d_loss: 0.0190 | g_loss: 4.6841
Epoch [    7/   10] | d_loss: 0.0079 | g_loss: 7.0080
Epoch [    7/   10] | d_loss: 0.0428 | g_loss: 5.4222
Epoch [    7/   10] | d_loss: 0.2687 | g_loss: 2.4581
Epoch [    7/   10] | d_loss: 0.2428 | g_loss: 4.7961
Epoch [    7/   10] | d_loss: 0.0814 | g_loss: 6.6467
Epoch [    7/   10] | d_loss: 0.0221 | g_loss: 5.7515
Epoch [    7/   10] | d_loss: 0.3040 | g_loss: 6.0041
Epoch [    7/   10] | d_loss: 0.0474 | g_loss: 5.0013
Epoch [    7/   10] | d_loss: 0.0253 | g_loss: 5.6630
Epoch [    7/   10] | d_loss: 0.1557 | g_loss: 4.1700
Epoch [    7/   10] | d_loss: 0.0159 | g_loss: 5.6011
Epoch [    7/   10] | d_loss: 0.0052 | g_loss: 5.9203
Epoch [    7/   10] | d_loss: 0.1307 | g_loss: 6.5701
Epoch [    7/   10] | d_loss: 0.0110 | g_loss: 4.8606
Epoch [    7/   10] | d_loss: 0.0292 | g_loss: 5.6682
Epoch [    7/   10] | d_loss: 0.1491 | g_loss: 4.8687
Epoch [    7/   10] | d_loss: 0.0634 | g_loss: 4.0287
Epoch [    7/   10] | d_loss: 0.0610 | g_loss: 5.3933
Epoch [    7/   10] | d_loss: 0.0815 | g_loss: 5.4900
Epoch [    7/   10] | d_loss: 0.0565 | g_loss: 4.5226
Epoch [    7/   10] | d_loss: 0.0086 | g_loss: 4.3966
Epoch [    7/   10] | d_loss: 0.0529 | g_loss: 5.0404
Epoch [    7/   10] | d_loss: 0.0236 | g_loss: 3.9375
Epoch [    7/   10] | d_loss: 0.0118 | g_loss: 6.9910
Epoch [    7/   10] | d_loss: 0.0054 | g_loss: 6.9389
Epoch [    7/   10] | d_loss: 0.3944 | g_loss: 5.4857
Epoch [    7/   10] | d_loss: 0.0295 | g_loss: 5.9461
Epoch [    7/   10] | d_loss: 0.0457 | g_loss: 5.4852
Epoch [    7/   10] | d_loss: 0.0084 | g_loss: 6.5139
Epoch [    7/   10] | d_loss: 0.0371 | g_loss: 5.6691
Epoch [    7/   10] | d_loss: 0.2058 | g_loss: 6.3953
Epoch [    7/   10] | d_loss: 0.1754 | g_loss: 6.0495
Epoch [    7/   10] | d_loss: 0.0224 | g_loss: 6.4810
Epoch [    7/   10] | d_loss: 0.2182 | g_loss: 4.0378
Epoch [    7/   10] | d_loss: 0.0737 | g_loss: 2.4657
Epoch [    7/   10] | d_loss: 0.0725 | g_loss: 4.9562
Epoch [    7/   10] | d_loss: 0.0283 | g_loss: 6.3838
Epoch [    7/   10] | d_loss: 0.0301 | g_loss: 6.3615
Epoch [    7/   10] | d_loss: 0.0931 | g_loss: 4.3573
Epoch [    7/   10] | d_loss: 0.2551 | g_loss: 4.0330
Epoch [    7/   10] | d_loss: 0.1463 | g_loss: 1.9966
Epoch [    7/   10] | d_loss: 0.4471 | g_loss: 4.1224
Epoch [    7/   10] | d_loss: 0.1341 | g_loss: 5.2130
Epoch [    7/   10] | d_loss: 0.2690 | g_loss: 4.4226
Epoch [    7/   10] | d_loss: 0.0400 | g_loss: 4.4027
Epoch [    7/   10] | d_loss: 0.1629 | g_loss: 4.8976
Epoch [    7/   10] | d_loss: 0.0690 | g_loss: 5.1245
Epoch [    7/   10] | d_loss: 0.0390 | g_loss: 3.5197
Epoch [    7/   10] | d_loss: 0.0198 | g_loss: 5.6080
Epoch [    7/   10] | d_loss: 0.0214 | g_loss: 6.9677
Epoch [    7/   10] | d_loss: 0.0333 | g_loss: 7.5028
Epoch [    7/   10] | d_loss: 0.1220 | g_loss: 4.5276
Epoch [    7/   10] | d_loss: 0.0706 | g_loss: 4.1480
Epoch [    7/   10] | d_loss: 0.0823 | g_loss: 2.6276
Epoch [    7/   10] | d_loss: 0.3741 | g_loss: 4.8824
Epoch [    7/   10] | d_loss: 0.0416 | g_loss: 5.7685
Epoch [    7/   10] | d_loss: 0.0609 | g_loss: 4.0105
Epoch [    7/   10] | d_loss: 0.0183 | g_loss: 6.4889
Epoch [    7/   10] | d_loss: 0.3560 | g_loss: 4.7258
Epoch [    7/   10] | d_loss: 0.0117 | g_loss: 6.9586
Epoch [    7/   10] | d_loss: 0.1741 | g_loss: 3.3472
Epoch [    7/   10] | d_loss: 0.1410 | g_loss: 7.4380
Epoch [    7/   10] | d_loss: 0.2095 | g_loss: 5.3424
Epoch [    7/   10] | d_loss: 0.0368 | g_loss: 4.7740
Epoch [    7/   10] | d_loss: 0.0728 | g_loss: 5.3982
Epoch [    7/   10] | d_loss: 0.0057 | g_loss: 5.0628
Epoch [    7/   10] | d_loss: 0.2440 | g_loss: 4.5531
Epoch [    7/   10] | d_loss: 0.1998 | g_loss: 5.0893
Epoch [    7/   10] | d_loss: 0.0392 | g_loss: 6.7257
Epoch [    7/   10] | d_loss: 0.0304 | g_loss: 6.3495
Epoch [    7/   10] | d_loss: 0.0696 | g_loss: 4.9323
Epoch [    7/   10] | d_loss: 0.0122 | g_loss: 6.6647
Epoch [    7/   10] | d_loss: 0.0101 | g_loss: 5.9469
Epoch [    7/   10] | d_loss: 0.0816 | g_loss: 0.9639
Epoch [    7/   10] | d_loss: 0.0373 | g_loss: 4.9750
Epoch [    7/   10] | d_loss: 1.0405 | g_loss: 0.2970
Epoch [    7/   10] | d_loss: 0.4559 | g_loss: 3.3031
Epoch [    7/   10] | d_loss: 0.0133 | g_loss: 5.2444
Epoch [    7/   10] | d_loss: 0.1577 | g_loss: 5.5143
Epoch [    7/   10] | d_loss: 0.0622 | g_loss: 6.4350
Epoch [    7/   10] | d_loss: 0.0302 | g_loss: 7.3159
Epoch [    7/   10] | d_loss: 0.2042 | g_loss: 3.9353
Epoch [    7/   10] | d_loss: 0.0985 | g_loss: 5.7357
Epoch [    7/   10] | d_loss: 0.0110 | g_loss: 5.4135
Epoch [    7/   10] | d_loss: 0.1891 | g_loss: 4.4480
Epoch [    7/   10] | d_loss: 0.1511 | g_loss: 2.1671
Epoch [    7/   10] | d_loss: 0.0917 | g_loss: 7.6413
Epoch [    7/   10] | d_loss: 0.0682 | g_loss: 7.2109
Epoch [    7/   10] | d_loss: 0.0217 | g_loss: 3.6112
Epoch [    7/   10] | d_loss: 0.1801 | g_loss: 3.6871
Epoch [    7/   10] | d_loss: 0.0332 | g_loss: 6.6910
Epoch [    7/   10] | d_loss: 0.0085 | g_loss: 7.1514
Epoch [    7/   10] | d_loss: 0.0382 | g_loss: 4.4058
Epoch [    7/   10] | d_loss: 0.0384 | g_loss: 3.0679
Epoch [    7/   10] | d_loss: 0.0272 | g_loss: 6.3022
Epoch [    7/   10] | d_loss: 0.0028 | g_loss: 6.6370
Epoch [    7/   10] | d_loss: 0.0156 | g_loss: 4.3315
Epoch [    7/   10] | d_loss: 0.0184 | g_loss: 5.7650
Epoch [    7/   10] | d_loss: 0.0059 | g_loss: 7.5089
Epoch [    7/   10] | d_loss: 0.0632 | g_loss: 2.7578
Epoch [    7/   10] | d_loss: 0.1174 | g_loss: 2.6507
Epoch [    8/   10] | d_loss: 0.0255 | g_loss: 4.4700
Epoch [    8/   10] | d_loss: 0.1833 | g_loss: 6.3185
Epoch [    8/   10] | d_loss: 0.1493 | g_loss: 3.2712
Epoch [    8/   10] | d_loss: 0.1466 | g_loss: 6.2052
Epoch [    8/   10] | d_loss: 0.0163 | g_loss: 5.0812
Epoch [    8/   10] | d_loss: 0.1333 | g_loss: 5.1327
Epoch [    8/   10] | d_loss: 0.4070 | g_loss: 6.6424
Epoch [    8/   10] | d_loss: 0.0039 | g_loss: 6.0410
Epoch [    8/   10] | d_loss: 0.0068 | g_loss: 5.8721
Epoch [    8/   10] | d_loss: 0.1316 | g_loss: 3.6450
Epoch [    8/   10] | d_loss: 0.1426 | g_loss: 3.2857
Epoch [    8/   10] | d_loss: 0.0053 | g_loss: 5.1548
Epoch [    8/   10] | d_loss: 0.0730 | g_loss: 4.0392
Epoch [    8/   10] | d_loss: 0.0553 | g_loss: 4.7315
Epoch [    8/   10] | d_loss: 0.1162 | g_loss: 5.3679
Epoch [    8/   10] | d_loss: 0.0156 | g_loss: 6.3753
Epoch [    8/   10] | d_loss: 0.0533 | g_loss: 5.8610
Epoch [    8/   10] | d_loss: 0.0224 | g_loss: 5.8134
Epoch [    8/   10] | d_loss: 0.0206 | g_loss: 7.6801
Epoch [    8/   10] | d_loss: 0.0539 | g_loss: 8.8607
Epoch [    8/   10] | d_loss: 0.0207 | g_loss: 5.2763
Epoch [    8/   10] | d_loss: 0.0836 | g_loss: 3.7914
Epoch [    8/   10] | d_loss: 0.0834 | g_loss: 4.5310
Epoch [    8/   10] | d_loss: 0.1189 | g_loss: 6.6212
Epoch [    8/   10] | d_loss: 0.0500 | g_loss: 6.3545
Epoch [    8/   10] | d_loss: 0.5715 | g_loss: 7.8317
Epoch [    8/   10] | d_loss: 0.0302 | g_loss: 1.8886
Epoch [    8/   10] | d_loss: 0.0710 | g_loss: 4.0735
Epoch [    8/   10] | d_loss: 0.0178 | g_loss: 4.9889
Epoch [    8/   10] | d_loss: 0.0440 | g_loss: 5.0695
Epoch [    8/   10] | d_loss: 0.0272 | g_loss: 7.3339
Epoch [    8/   10] | d_loss: 0.0695 | g_loss: 4.7181
Epoch [    8/   10] | d_loss: 0.0685 | g_loss: 2.3723
Epoch [    8/   10] | d_loss: 0.3476 | g_loss: 5.9768
Epoch [    8/   10] | d_loss: 0.0664 | g_loss: 9.5245
Epoch [    8/   10] | d_loss: 0.7458 | g_loss: 2.9843
Epoch [    8/   10] | d_loss: 0.1525 | g_loss: 4.2088
Epoch [    8/   10] | d_loss: 0.6091 | g_loss: 4.8925
Epoch [    8/   10] | d_loss: 0.0083 | g_loss: 5.0059
Epoch [    8/   10] | d_loss: 0.8516 | g_loss: 3.4184
Epoch [    8/   10] | d_loss: 0.0254 | g_loss: 5.2354
Epoch [    8/   10] | d_loss: 0.0066 | g_loss: 7.0028
Epoch [    8/   10] | d_loss: 0.1347 | g_loss: 3.3192
Epoch [    8/   10] | d_loss: 0.0255 | g_loss: 5.7410
Epoch [    8/   10] | d_loss: 0.1707 | g_loss: 5.8746
Epoch [    8/   10] | d_loss: 0.0096 | g_loss: 7.4579
Epoch [    8/   10] | d_loss: 0.2882 | g_loss: 4.3399
Epoch [    8/   10] | d_loss: 0.0368 | g_loss: 6.7115
Epoch [    8/   10] | d_loss: 0.0499 | g_loss: 4.0559
Epoch [    8/   10] | d_loss: 0.0139 | g_loss: 5.9457
Epoch [    8/   10] | d_loss: 0.1163 | g_loss: 5.8962
Epoch [    8/   10] | d_loss: 0.0113 | g_loss: 6.9711
Epoch [    8/   10] | d_loss: 0.0136 | g_loss: 6.4400
Epoch [    8/   10] | d_loss: 0.0231 | g_loss: 1.0123
Epoch [    8/   10] | d_loss: 0.0822 | g_loss: 4.7611
Epoch [    8/   10] | d_loss: 0.0397 | g_loss: 6.2215
Epoch [    8/   10] | d_loss: 0.0405 | g_loss: 4.0447
Epoch [    8/   10] | d_loss: 0.0905 | g_loss: 4.3321
Epoch [    8/   10] | d_loss: 0.0190 | g_loss: 5.8017
Epoch [    8/   10] | d_loss: 0.0423 | g_loss: 4.5971
Epoch [    8/   10] | d_loss: 0.0325 | g_loss: 5.2521
Epoch [    8/   10] | d_loss: 0.0844 | g_loss: 4.1918
Epoch [    8/   10] | d_loss: 0.0162 | g_loss: 5.9669
Epoch [    8/   10] | d_loss: 0.2468 | g_loss: 7.6100
Epoch [    8/   10] | d_loss: 0.0186 | g_loss: 7.3959
Epoch [    8/   10] | d_loss: 0.0136 | g_loss: 4.5906
Epoch [    8/   10] | d_loss: 0.0459 | g_loss: 4.6870
Epoch [    8/   10] | d_loss: 0.1220 | g_loss: 4.2999
Epoch [    8/   10] | d_loss: 0.0191 | g_loss: 7.4175
Epoch [    8/   10] | d_loss: 0.1601 | g_loss: 2.5560
Epoch [    8/   10] | d_loss: 0.2125 | g_loss: 3.8338
Epoch [    8/   10] | d_loss: 0.1395 | g_loss: 4.3098
Epoch [    8/   10] | d_loss: 0.0298 | g_loss: 7.8283
Epoch [    8/   10] | d_loss: 0.0721 | g_loss: 5.0348
Epoch [    8/   10] | d_loss: 0.4392 | g_loss: 6.1592
Epoch [    8/   10] | d_loss: 0.0372 | g_loss: 5.6886
Epoch [    8/   10] | d_loss: 0.0213 | g_loss: 3.6982
Epoch [    8/   10] | d_loss: 0.1700 | g_loss: 5.9085
Epoch [    8/   10] | d_loss: 0.1106 | g_loss: 6.1745
Epoch [    8/   10] | d_loss: 3.7391 | g_loss: 4.8029
Epoch [    8/   10] | d_loss: 0.0400 | g_loss: 5.1491
Epoch [    8/   10] | d_loss: 0.0313 | g_loss: 4.2013
Epoch [    8/   10] | d_loss: 0.0500 | g_loss: 7.0942
Epoch [    8/   10] | d_loss: 0.0281 | g_loss: 5.1959
Epoch [    8/   10] | d_loss: 0.0344 | g_loss: 3.7636
Epoch [    8/   10] | d_loss: 0.1371 | g_loss: 3.6708
Epoch [    8/   10] | d_loss: 0.0094 | g_loss: 6.1427
Epoch [    8/   10] | d_loss: 0.0235 | g_loss: 4.2939
Epoch [    8/   10] | d_loss: 0.1395 | g_loss: 6.0291
Epoch [    8/   10] | d_loss: 0.0135 | g_loss: 4.1284
Epoch [    8/   10] | d_loss: 0.0703 | g_loss: 5.7127
Epoch [    8/   10] | d_loss: 0.1673 | g_loss: 5.9821
Epoch [    8/   10] | d_loss: 0.0062 | g_loss: 6.3907
Epoch [    8/   10] | d_loss: 0.0326 | g_loss: 3.8301
Epoch [    8/   10] | d_loss: 0.1591 | g_loss: 5.7797
Epoch [    8/   10] | d_loss: 0.0235 | g_loss: 5.2969
Epoch [    8/   10] | d_loss: 0.4758 | g_loss: 2.8859
Epoch [    8/   10] | d_loss: 0.1209 | g_loss: 7.1071
Epoch [    8/   10] | d_loss: 0.7784 | g_loss: 10.2048
Epoch [    8/   10] | d_loss: 0.5407 | g_loss: 4.7023
Epoch [    8/   10] | d_loss: 0.1225 | g_loss: 4.4314
Epoch [    8/   10] | d_loss: 0.0870 | g_loss: 7.1316
Epoch [    8/   10] | d_loss: 0.0998 | g_loss: 4.5032
Epoch [    8/   10] | d_loss: 0.3577 | g_loss: 3.7924
Epoch [    8/   10] | d_loss: 0.0399 | g_loss: 3.3656
Epoch [    8/   10] | d_loss: 0.7832 | g_loss: 5.2319
Epoch [    8/   10] | d_loss: 0.3907 | g_loss: 2.3102
Epoch [    8/   10] | d_loss: 0.0530 | g_loss: 3.3862
Epoch [    8/   10] | d_loss: 0.0661 | g_loss: 2.5648
Epoch [    8/   10] | d_loss: 0.0615 | g_loss: 4.2359
Epoch [    8/   10] | d_loss: 0.8196 | g_loss: 1.5699
Epoch [    8/   10] | d_loss: 0.0444 | g_loss: 8.3972
Epoch [    8/   10] | d_loss: 0.0207 | g_loss: 3.5407
Epoch [    8/   10] | d_loss: 0.0144 | g_loss: 7.2849
Epoch [    8/   10] | d_loss: 0.0153 | g_loss: 6.3157
Epoch [    8/   10] | d_loss: 0.5296 | g_loss: 6.8298
Epoch [    8/   10] | d_loss: 0.0474 | g_loss: 4.5283
Epoch [    8/   10] | d_loss: 0.0414 | g_loss: 7.0463
Epoch [    8/   10] | d_loss: 0.0270 | g_loss: 5.1393
Epoch [    8/   10] | d_loss: 0.0550 | g_loss: 6.2016
Epoch [    8/   10] | d_loss: 0.2279 | g_loss: 5.2161
Epoch [    8/   10] | d_loss: 0.0320 | g_loss: 6.7687
Epoch [    8/   10] | d_loss: 0.0171 | g_loss: 4.8691
Epoch [    8/   10] | d_loss: 0.0334 | g_loss: 4.4441
Epoch [    8/   10] | d_loss: 0.2039 | g_loss: 3.7041
Epoch [    8/   10] | d_loss: 0.1183 | g_loss: 7.0692
Epoch [    8/   10] | d_loss: 0.1196 | g_loss: 5.9173
Epoch [    9/   10] | d_loss: 3.6710 | g_loss: 1.6685
Epoch [    9/   10] | d_loss: 0.0266 | g_loss: 4.2748
Epoch [    9/   10] | d_loss: 0.1463 | g_loss: 4.2225
Epoch [    9/   10] | d_loss: 0.3722 | g_loss: 3.4752
Epoch [    9/   10] | d_loss: 0.0249 | g_loss: 5.3020
Epoch [    9/   10] | d_loss: 0.0329 | g_loss: 3.2249
Epoch [    9/   10] | d_loss: 0.0677 | g_loss: 6.6001
Epoch [    9/   10] | d_loss: 0.0248 | g_loss: 8.2657
Epoch [    9/   10] | d_loss: 0.0045 | g_loss: 5.5339
Epoch [    9/   10] | d_loss: 0.0097 | g_loss: 3.8787
Epoch [    9/   10] | d_loss: 0.0073 | g_loss: 6.5555
Epoch [    9/   10] | d_loss: 0.0166 | g_loss: 4.5051
Epoch [    9/   10] | d_loss: 0.0240 | g_loss: 3.9440
Epoch [    9/   10] | d_loss: 0.2853 | g_loss: 1.8623
Epoch [    9/   10] | d_loss: 0.0999 | g_loss: 6.6326
Epoch [    9/   10] | d_loss: 0.0306 | g_loss: 6.7000
Epoch [    9/   10] | d_loss: 0.0704 | g_loss: 6.8177
Epoch [    9/   10] | d_loss: 0.0221 | g_loss: 3.8886
Epoch [    9/   10] | d_loss: 0.0254 | g_loss: 5.8313
Epoch [    9/   10] | d_loss: 0.0562 | g_loss: 6.8763
Epoch [    9/   10] | d_loss: 0.0098 | g_loss: 8.0890
Epoch [    9/   10] | d_loss: 0.0543 | g_loss: 4.3469
Epoch [    9/   10] | d_loss: 0.0881 | g_loss: 5.4972
Epoch [    9/   10] | d_loss: 0.0103 | g_loss: 6.0187
Epoch [    9/   10] | d_loss: 0.1352 | g_loss: 5.2190
Epoch [    9/   10] | d_loss: 0.0511 | g_loss: 4.2719
Epoch [    9/   10] | d_loss: 0.0387 | g_loss: 6.7140
Epoch [    9/   10] | d_loss: 0.1529 | g_loss: 5.9191
Epoch [    9/   10] | d_loss: 0.0326 | g_loss: 5.1163
Epoch [    9/   10] | d_loss: 0.0084 | g_loss: 4.0485
Epoch [    9/   10] | d_loss: 0.0516 | g_loss: 6.6843
Epoch [    9/   10] | d_loss: 0.2604 | g_loss: 5.6436
Epoch [    9/   10] | d_loss: 0.0613 | g_loss: 2.4600
Epoch [    9/   10] | d_loss: 0.0995 | g_loss: 6.2161
Epoch [    9/   10] | d_loss: 0.0129 | g_loss: 6.7905
Epoch [    9/   10] | d_loss: 0.0248 | g_loss: 5.8314
Epoch [    9/   10] | d_loss: 0.0260 | g_loss: 5.1360
Epoch [    9/   10] | d_loss: 0.0069 | g_loss: 6.1606
Epoch [    9/   10] | d_loss: 0.0738 | g_loss: 6.0924
Epoch [    9/   10] | d_loss: 0.0429 | g_loss: 5.9244
Epoch [    9/   10] | d_loss: 0.0704 | g_loss: 8.6120
Epoch [    9/   10] | d_loss: 0.0605 | g_loss: 4.5297
Epoch [    9/   10] | d_loss: 0.0632 | g_loss: 5.0572
Epoch [    9/   10] | d_loss: 0.0421 | g_loss: 3.4610
Epoch [    9/   10] | d_loss: 0.1180 | g_loss: 5.1381
Epoch [    9/   10] | d_loss: 0.0463 | g_loss: 8.5484
Epoch [    9/   10] | d_loss: 0.1533 | g_loss: 4.1362
Epoch [    9/   10] | d_loss: 0.0353 | g_loss: 4.8587
Epoch [    9/   10] | d_loss: 0.0476 | g_loss: 6.1762
Epoch [    9/   10] | d_loss: 0.0432 | g_loss: 6.3163
Epoch [    9/   10] | d_loss: 0.0670 | g_loss: 8.3754
Epoch [    9/   10] | d_loss: 0.0466 | g_loss: 4.3589
Epoch [    9/   10] | d_loss: 0.0204 | g_loss: 5.8545
Epoch [    9/   10] | d_loss: 0.0473 | g_loss: 6.5521
Epoch [    9/   10] | d_loss: 0.0697 | g_loss: 2.2675
Epoch [    9/   10] | d_loss: 0.1933 | g_loss: 6.4800
Epoch [    9/   10] | d_loss: 0.2825 | g_loss: 3.8047
Epoch [    9/   10] | d_loss: 0.0534 | g_loss: 6.1858
Epoch [    9/   10] | d_loss: 0.0313 | g_loss: 5.1354
Epoch [    9/   10] | d_loss: 0.1117 | g_loss: 4.5453
Epoch [    9/   10] | d_loss: 0.0265 | g_loss: 5.7632
Epoch [    9/   10] | d_loss: 0.0137 | g_loss: 5.8550
Epoch [    9/   10] | d_loss: 0.0464 | g_loss: 4.7759
Epoch [    9/   10] | d_loss: 0.0056 | g_loss: 2.1638
Epoch [    9/   10] | d_loss: 0.0380 | g_loss: 4.3216
Epoch [    9/   10] | d_loss: 0.1297 | g_loss: 4.0653
Epoch [    9/   10] | d_loss: 0.0245 | g_loss: 3.2897
Epoch [    9/   10] | d_loss: 0.3666 | g_loss: 5.8167
Epoch [    9/   10] | d_loss: 0.0446 | g_loss: 4.5231
Epoch [    9/   10] | d_loss: 0.0163 | g_loss: 4.4107
Epoch [    9/   10] | d_loss: 0.0601 | g_loss: 6.6622
Epoch [    9/   10] | d_loss: 0.2707 | g_loss: 3.7845
Epoch [    9/   10] | d_loss: 0.0102 | g_loss: 5.9280
Epoch [    9/   10] | d_loss: 0.0288 | g_loss: 6.5267
Epoch [    9/   10] | d_loss: 0.0780 | g_loss: 8.0353
Epoch [    9/   10] | d_loss: 0.0608 | g_loss: 6.2143
Epoch [    9/   10] | d_loss: 0.0073 | g_loss: 6.9533
Epoch [    9/   10] | d_loss: 0.0350 | g_loss: 4.0633
Epoch [    9/   10] | d_loss: 0.0311 | g_loss: 1.5901
Epoch [    9/   10] | d_loss: 0.0344 | g_loss: 5.4771
Epoch [    9/   10] | d_loss: 0.0469 | g_loss: 4.7766
Epoch [    9/   10] | d_loss: 0.0095 | g_loss: 7.0114
Epoch [    9/   10] | d_loss: 0.0171 | g_loss: 7.2034
Epoch [    9/   10] | d_loss: 0.4573 | g_loss: 8.4226
Epoch [    9/   10] | d_loss: 0.1512 | g_loss: 7.2870
Epoch [    9/   10] | d_loss: 0.0404 | g_loss: 6.0515
Epoch [    9/   10] | d_loss: 0.0546 | g_loss: 5.1039
Epoch [    9/   10] | d_loss: 0.0097 | g_loss: 7.5527
Epoch [    9/   10] | d_loss: 0.0070 | g_loss: 6.2017
Epoch [    9/   10] | d_loss: 0.0104 | g_loss: 7.6315
Epoch [    9/   10] | d_loss: 0.0210 | g_loss: 6.1219
Epoch [    9/   10] | d_loss: 0.0348 | g_loss: 6.6687
Epoch [    9/   10] | d_loss: 0.0531 | g_loss: 5.6772
Epoch [    9/   10] | d_loss: 0.0869 | g_loss: 6.1938
Epoch [    9/   10] | d_loss: 0.0784 | g_loss: 6.1501
Epoch [    9/   10] | d_loss: 0.1821 | g_loss: 5.0648
Epoch [    9/   10] | d_loss: 0.0972 | g_loss: 4.4470
Epoch [    9/   10] | d_loss: 0.0173 | g_loss: 6.8817
Epoch [    9/   10] | d_loss: 0.1212 | g_loss: 6.2651
Epoch [    9/   10] | d_loss: 0.0135 | g_loss: 7.7054
Epoch [    9/   10] | d_loss: 0.0298 | g_loss: 7.1837
Epoch [    9/   10] | d_loss: 0.1158 | g_loss: 7.3685
Epoch [    9/   10] | d_loss: 0.1447 | g_loss: 5.0622
Epoch [    9/   10] | d_loss: 0.2948 | g_loss: 3.7220
Epoch [    9/   10] | d_loss: 0.0858 | g_loss: 5.6277
Epoch [    9/   10] | d_loss: 0.0246 | g_loss: 6.9544
Epoch [    9/   10] | d_loss: 0.2539 | g_loss: 2.5089
Epoch [    9/   10] | d_loss: 0.0097 | g_loss: 5.7004
Epoch [    9/   10] | d_loss: 0.0277 | g_loss: 5.0174
Epoch [    9/   10] | d_loss: 0.0210 | g_loss: 7.3393
Epoch [    9/   10] | d_loss: 0.0704 | g_loss: 2.9685
Epoch [    9/   10] | d_loss: 0.1269 | g_loss: 5.3173
Epoch [    9/   10] | d_loss: 0.4942 | g_loss: 3.3350
Epoch [    9/   10] | d_loss: 0.0391 | g_loss: 4.0708
Epoch [    9/   10] | d_loss: 0.0269 | g_loss: 5.1471
Epoch [    9/   10] | d_loss: 0.0372 | g_loss: 4.4229
Epoch [    9/   10] | d_loss: 0.1152 | g_loss: 1.7167
Epoch [    9/   10] | d_loss: 0.5370 | g_loss: 3.5165
Epoch [    9/   10] | d_loss: 0.0039 | g_loss: 5.3864
Epoch [    9/   10] | d_loss: 0.0374 | g_loss: 5.6930
Epoch [    9/   10] | d_loss: 0.0199 | g_loss: 6.3860
Epoch [    9/   10] | d_loss: 0.0437 | g_loss: 4.3603
Epoch [    9/   10] | d_loss: 0.0636 | g_loss: 3.7599
Epoch [    9/   10] | d_loss: 0.6410 | g_loss: 2.4903
Epoch [    9/   10] | d_loss: 0.7800 | g_loss: 6.3820
Epoch [    9/   10] | d_loss: 0.3770 | g_loss: 6.9263
Epoch [    9/   10] | d_loss: 0.5202 | g_loss: 3.0650
Epoch [   10/   10] | d_loss: 0.0208 | g_loss: 3.7870
Epoch [   10/   10] | d_loss: 0.2855 | g_loss: 4.8292
Epoch [   10/   10] | d_loss: 0.0182 | g_loss: 7.0353
Epoch [   10/   10] | d_loss: 0.0408 | g_loss: 6.4111
Epoch [   10/   10] | d_loss: 0.0952 | g_loss: 4.5399
Epoch [   10/   10] | d_loss: 0.2483 | g_loss: 3.1488
Epoch [   10/   10] | d_loss: 0.1151 | g_loss: 5.2510
Epoch [   10/   10] | d_loss: 0.0544 | g_loss: 2.7521
Epoch [   10/   10] | d_loss: 0.0253 | g_loss: 3.6003
Epoch [   10/   10] | d_loss: 0.0394 | g_loss: 5.3613
Epoch [   10/   10] | d_loss: 0.0157 | g_loss: 6.3726
Epoch [   10/   10] | d_loss: 1.1189 | g_loss: 2.0506
Epoch [   10/   10] | d_loss: 0.0102 | g_loss: 7.4515
Epoch [   10/   10] | d_loss: 0.2272 | g_loss: 3.8293
Epoch [   10/   10] | d_loss: 0.2422 | g_loss: 5.2184
Epoch [   10/   10] | d_loss: 0.0504 | g_loss: 3.1540
Epoch [   10/   10] | d_loss: 0.2595 | g_loss: 2.7259
Epoch [   10/   10] | d_loss: 0.0180 | g_loss: 6.5554
Epoch [   10/   10] | d_loss: 0.1382 | g_loss: 5.4461
Epoch [   10/   10] | d_loss: 0.0099 | g_loss: 4.6016
Epoch [   10/   10] | d_loss: 0.1584 | g_loss: 3.6194
Epoch [   10/   10] | d_loss: 0.0294 | g_loss: 8.6336
Epoch [   10/   10] | d_loss: 0.0147 | g_loss: 6.0058
Epoch [   10/   10] | d_loss: 0.0124 | g_loss: 6.2966
Epoch [   10/   10] | d_loss: 0.0303 | g_loss: 7.1824
Epoch [   10/   10] | d_loss: 0.0235 | g_loss: 4.1537
Epoch [   10/   10] | d_loss: 0.0326 | g_loss: 4.5685
Epoch [   10/   10] | d_loss: 0.2365 | g_loss: 4.3529
Epoch [   10/   10] | d_loss: 0.3841 | g_loss: 3.1139
Epoch [   10/   10] | d_loss: 0.1267 | g_loss: 5.7291
Epoch [   10/   10] | d_loss: 0.0457 | g_loss: 4.7545
Epoch [   10/   10] | d_loss: 0.0200 | g_loss: 5.7397
Epoch [   10/   10] | d_loss: 0.0092 | g_loss: 5.5821
Epoch [   10/   10] | d_loss: 0.0717 | g_loss: 5.4005
Epoch [   10/   10] | d_loss: 0.0250 | g_loss: 3.9849
Epoch [   10/   10] | d_loss: 0.0136 | g_loss: 7.3404
Epoch [   10/   10] | d_loss: 0.0098 | g_loss: 5.6649
Epoch [   10/   10] | d_loss: 0.0136 | g_loss: 6.6803
Epoch [   10/   10] | d_loss: 0.0122 | g_loss: 5.7824
Epoch [   10/   10] | d_loss: 0.0409 | g_loss: 4.2762
Epoch [   10/   10] | d_loss: 0.0711 | g_loss: 5.8411
Epoch [   10/   10] | d_loss: 0.4731 | g_loss: 6.9981
Epoch [   10/   10] | d_loss: 0.0379 | g_loss: 8.2671
Epoch [   10/   10] | d_loss: 0.0546 | g_loss: 4.9182
Epoch [   10/   10] | d_loss: 0.1052 | g_loss: 5.3334
Epoch [   10/   10] | d_loss: 0.0995 | g_loss: 6.0133
Epoch [   10/   10] | d_loss: 0.1513 | g_loss: 4.4162
Epoch [   10/   10] | d_loss: 0.0153 | g_loss: 3.9488
Epoch [   10/   10] | d_loss: 0.1670 | g_loss: 6.4623
Epoch [   10/   10] | d_loss: 0.1229 | g_loss: 5.5721
Epoch [   10/   10] | d_loss: 0.3467 | g_loss: 6.3732
Epoch [   10/   10] | d_loss: 0.1115 | g_loss: 5.8688
Epoch [   10/   10] | d_loss: 0.0549 | g_loss: 6.4606
Epoch [   10/   10] | d_loss: 0.4343 | g_loss: 3.5590
Epoch [   10/   10] | d_loss: 0.0360 | g_loss: 6.6444
Epoch [   10/   10] | d_loss: 0.1199 | g_loss: 3.8239
Epoch [   10/   10] | d_loss: 0.0907 | g_loss: 2.4722
Epoch [   10/   10] | d_loss: 0.1132 | g_loss: 2.3650
Epoch [   10/   10] | d_loss: 0.0167 | g_loss: 4.9647
Epoch [   10/   10] | d_loss: 0.0113 | g_loss: 4.7846
Epoch [   10/   10] | d_loss: 1.1302 | g_loss: 2.2145
Epoch [   10/   10] | d_loss: 0.0391 | g_loss: 2.2645
Epoch [   10/   10] | d_loss: 0.1266 | g_loss: 5.1298
Epoch [   10/   10] | d_loss: 0.2144 | g_loss: 3.3508
Epoch [   10/   10] | d_loss: 0.0727 | g_loss: 5.0634
Epoch [   10/   10] | d_loss: 0.1298 | g_loss: 5.6992
Epoch [   10/   10] | d_loss: 0.0534 | g_loss: 3.3627
Epoch [   10/   10] | d_loss: 0.0361 | g_loss: 7.4630
Epoch [   10/   10] | d_loss: 0.0106 | g_loss: 5.4509
Epoch [   10/   10] | d_loss: 0.0193 | g_loss: 4.7274
Epoch [   10/   10] | d_loss: 0.0092 | g_loss: 6.5354
Epoch [   10/   10] | d_loss: 0.0371 | g_loss: 4.4569
Epoch [   10/   10] | d_loss: 0.0861 | g_loss: 5.8133
Epoch [   10/   10] | d_loss: 0.1041 | g_loss: 5.1991
Epoch [   10/   10] | d_loss: 0.0731 | g_loss: 3.8671
Epoch [   10/   10] | d_loss: 0.3180 | g_loss: 6.9120
Epoch [   10/   10] | d_loss: 0.0146 | g_loss: 4.4931
Epoch [   10/   10] | d_loss: 0.0199 | g_loss: 7.4146
Epoch [   10/   10] | d_loss: 0.0480 | g_loss: 3.7251
Epoch [   10/   10] | d_loss: 0.0229 | g_loss: 5.4488
Epoch [   10/   10] | d_loss: 0.0334 | g_loss: 6.6909
Epoch [   10/   10] | d_loss: 0.0229 | g_loss: 5.9383
Epoch [   10/   10] | d_loss: 0.0299 | g_loss: 3.1789
Epoch [   10/   10] | d_loss: 0.0062 | g_loss: 6.6405
Epoch [   10/   10] | d_loss: 0.0577 | g_loss: 8.1050
Epoch [   10/   10] | d_loss: 0.0293 | g_loss: 4.8804
Epoch [   10/   10] | d_loss: 0.0109 | g_loss: 5.2012
Epoch [   10/   10] | d_loss: 0.0338 | g_loss: 2.3219
Epoch [   10/   10] | d_loss: 0.0063 | g_loss: 3.6236
Epoch [   10/   10] | d_loss: 0.0111 | g_loss: 5.5808
Epoch [   10/   10] | d_loss: 0.2970 | g_loss: 7.3816
Epoch [   10/   10] | d_loss: 0.0162 | g_loss: 7.1190
Epoch [   10/   10] | d_loss: 0.3230 | g_loss: 3.3509
Epoch [   10/   10] | d_loss: 0.1059 | g_loss: 4.2677
Epoch [   10/   10] | d_loss: 0.0103 | g_loss: 5.7178
Epoch [   10/   10] | d_loss: 0.0247 | g_loss: 5.9528
Epoch [   10/   10] | d_loss: 0.1266 | g_loss: 4.3778
Epoch [   10/   10] | d_loss: 0.1208 | g_loss: 5.8490
Epoch [   10/   10] | d_loss: 0.0449 | g_loss: 5.9282
Epoch [   10/   10] | d_loss: 0.1024 | g_loss: 5.6794
Epoch [   10/   10] | d_loss: 0.0252 | g_loss: 5.5530
Epoch [   10/   10] | d_loss: 0.0267 | g_loss: 6.3452
Epoch [   10/   10] | d_loss: 0.0204 | g_loss: 6.1864
Epoch [   10/   10] | d_loss: 0.0673 | g_loss: 6.5871
Epoch [   10/   10] | d_loss: 0.1011 | g_loss: 5.6018
Epoch [   10/   10] | d_loss: 0.3042 | g_loss: 8.3634
Epoch [   10/   10] | d_loss: 0.0154 | g_loss: 6.1472
Epoch [   10/   10] | d_loss: 0.0506 | g_loss: 5.7567
Epoch [   10/   10] | d_loss: 0.0026 | g_loss: 5.1970
Epoch [   10/   10] | d_loss: 0.1024 | g_loss: 5.5784
Epoch [   10/   10] | d_loss: 0.0230 | g_loss: 5.2716
Epoch [   10/   10] | d_loss: 0.0271 | g_loss: 4.8556
Epoch [   10/   10] | d_loss: 0.0630 | g_loss: 6.7108
Epoch [   10/   10] | d_loss: 0.5874 | g_loss: 5.8737
Epoch [   10/   10] | d_loss: 0.0140 | g_loss: 6.0631
Epoch [   10/   10] | d_loss: 0.1087 | g_loss: 6.9258
Epoch [   10/   10] | d_loss: 0.0552 | g_loss: 3.4844
Epoch [   10/   10] | d_loss: 0.2407 | g_loss: 7.1514
Epoch [   10/   10] | d_loss: 0.0074 | g_loss: 6.8648
Epoch [   10/   10] | d_loss: 0.0843 | g_loss: 5.1074
Epoch [   10/   10] | d_loss: 0.2092 | g_loss: 6.2337
Epoch [   10/   10] | d_loss: 0.0774 | g_loss: 4.3871
Epoch [   10/   10] | d_loss: 0.1497 | g_loss: 3.7420
Epoch [   10/   10] | d_loss: 0.0320 | g_loss: 5.1798
Epoch [   10/   10] | d_loss: 0.0847 | g_loss: 3.4172
Epoch [   10/   10] | d_loss: 0.1077 | g_loss: 6.1443
Epoch [   10/   10] | d_loss: 0.1181 | g_loss: 1.9679

Training loss¶

In [24]:
fig, ax = plt.subplots()
losses = np.array(losses)
plt.plot(losses.T[0], label='Discriminator', alpha=0.5)
plt.plot(losses.T[1], label='Generator', alpha=0.5)
plt.title("Training Losses")
plt.legend()
Out[24]:
<matplotlib.legend.Legend at 0x7fefbdaa0650>

Visualize the generated images¶

In [25]:
def view_samples(epoch, samples):
    fig, axes = plt.subplots(figsize=(16,4), nrows=2, ncols=8, sharey=True, sharex=True)
    for ax, img in zip(axes.flatten(), samples[epoch]):
        img = img.detach().cpu().numpy()
        img = np.transpose(img, (1, 2, 0))
        img = ((img + 1)*255 / (2)).astype(np.uint8)
        ax.xaxis.set_visible(False)
        ax.yaxis.set_visible(False)
        im = ax.imshow(img.reshape((32,32,3)))
In [26]:
with open('train_samples.pkl', 'rb') as f:
    samples = pkl.load(f)
In [27]:
_ = view_samples(-1, samples)
In [ ]:
 
In [ ]: