Clément Vignac, EPFL LTS4 and Guillermo Ortiz Jiménez, EPFL LTS4.
<your team number>
<your name> (for the individual submission) or <the names of all students in the team> (for the team submission)
Grading:
Submission:
In this assignment you will experiment with the main concepts of spectral graph theory, as well as familiarize yourself with the main data science techniques for network data.
The assignment is made of three parts:
import numpy as np
from scipy.spatial.distance import pdist, squareform
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
%matplotlib inline
from pygsp.graphs import TwoMoons
G = TwoMoons(moontype='synthesized', N=2000)
X = G.coords
Y = G.labels.astype(int)
plt.scatter(X[:, 0], X[:, 1], c=Y)
plt.show()
Build a similarity graph using the Euclidean distance between data points.
Note: Use an RBF kernel to set the edge weights $w_{ij}=\exp(-\|x_i - x_j\|_2^2 / (2 \sigma^2))$ of your adjacency matrix and threshold the ones with the smallest magnitude.
def epsilon_similarity_graph(X: np.ndarray, sigma=1, epsilon=0):
    """ X (n x d): coordinates of the n data points in R^d.
        sigma (float): width of the kernel
        epsilon (float): threshold
        Return:
        adjacency (n x n ndarray): adjacency matrix of the graph.
    """
    dist = squareform(pdist(X))  # pairwise Euclidean distances
    adjacency = np.exp(-dist**2 / (2 * sigma**2))  # RBF kernel weights
    adjacency[adjacency < epsilon] = 0  # sparsify: drop the weakest edges
    np.fill_diagonal(adjacency, 0)  # no self-loops
    return adjacency
adjacency = epsilon_similarity_graph(X, sigma=0.5, epsilon=0.1)
plt.spy(adjacency)
plt.show()
How do you choose `sigma`?
`sigma` reflects a typical (spatial) distance between the points. We want the graph to be connected, but we also want it to have two clusters that correspond to our data. One good strategy is to start with the average pairwise distance and then reduce it until two separated clusters start to appear.
How do you choose the threshold `epsilon`?
`epsilon` is a sparsity parameter. It should be low enough to keep the weights that carry meaningful information (the retained weights should still span a diverse range of values). A good strategy for choosing `epsilon` is to plot the distribution of the weights and tune it accordingly. In this case, the value of `epsilon` should be around 0.7.
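To illustrate that strategy, here is a minimal sketch (not part of the original solution) that plots the weight distribution before thresholding, assuming the `X`, `sigma=0.5` and `epsilon=0.1` used above:

# Sketch: histogram of the RBF weights before thresholding, to guide the choice of epsilon.
weights = np.exp(-squareform(pdist(X))**2 / (2 * 0.5**2))
weights = weights[np.triu_indices_from(weights, k=1)]  # off-diagonal entries only
plt.hist(weights, bins=100, log=True)
plt.axvline(0.1, color='r', label='epsilon = 0.1')
plt.xlabel('RBF weight')
plt.ylabel('count (log scale)')
plt.legend()
plt.show()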
Build the combinatorial and normalized graph laplacians for this dataset.
def compute_laplacian(adjacency: np.ndarray, normalize: bool):
    """ Return:
        L (n x n ndarray): combinatorial or symmetric normalized Laplacian.
    """
    D = np.diag(np.sum(adjacency, 1))  # degree matrix
    combinatorial = D - adjacency
    if normalize:
        # D^(-1/2), with degrees clipped at 1 to avoid division by zero
        D_norm = np.diag(np.clip(np.sum(adjacency, 1), 1, None)**(-1/2))
        return D_norm @ combinatorial @ D_norm
    else:
        return combinatorial
laplacian_comb = compute_laplacian(adjacency, normalize=False)
laplacian_norm = compute_laplacian(adjacency, normalize=True)
For both Laplacian matrices, compute the eigendecomposition $L = U \Lambda U^\top$, where the columns $u_k \in \mathbb{R}^N$ of $U = [u_1, \dots, u_N] \in \mathbb{R}^{N \times N}$ are the eigenvectors and the diagonal elements $\lambda_k = \Lambda_{kk}$ are the corresponding eigenvalues. Make sure that the eigenvalues are ordered, i.e., $\lambda_1 \leq \lambda_2 \leq \dots \leq \lambda_N$.
Justify your choice of a solver for the eigendecomposition.
We need a solver that works with real symmetric matrices. Also, we want the eigenvalues to be sorted. `np.linalg.eigh` is a good choice since it satisfies both conditions: it is designed for symmetric/Hermitian matrices and returns the eigenvalues in ascending order.
def spectral_decomposition(laplacian: np.ndarray):
    """ Return:
        lamb (np.array): eigenvalues of the Laplacian
        U (np.ndarray): corresponding eigenvectors.
    """
    return np.linalg.eigh(laplacian)
lamb_comb, U_comb = spectral_decomposition(laplacian_comb)
lamb_norm, U_norm = spectral_decomposition(laplacian_norm)
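As a quick sanity check (a sketch, not part of the original assignment), we can verify that the eigenvalues come out sorted and that the eigenvectors reconstruct the Laplacian:

# Sanity check: eigh returns ascending eigenvalues and an orthonormal eigenbasis
assert np.all(np.diff(lamb_comb) >= 0)
assert np.allclose(U_comb @ np.diag(lamb_comb) @ U_comb.T, laplacian_comb)
assert np.allclose(U_comb.T @ U_comb, np.eye(U_comb.shape[0]))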
We plot the sorted eigenvalues as a function of their index:
plt.figure(figsize=(12,5))
plt.subplot(121)
plt.plot(lamb_comb)
plt.xlabel('Index')
plt.ylabel('Eigenvalue')
plt.title('Eigenvalues $L_{comb}$')
plt.subplot(122)
plt.plot(lamb_norm)
plt.xlabel('Index')
plt.ylabel('Eigenvalue')
plt.title('Eigenvalues $L_{norm}$')
plt.show()
What is the lowest eigenvalue $\lambda_0$ and the corresponding eigenvector $u_0$? Answer for both Laplacian matrices.
For both the combinatorial and the normalized Laplacian, the lowest eigenvalue $\lambda_0$ is 0 (technically, it is not exactly 0 due to numerical error).
Here is a good detailed answer regarding eigenvectors (team 1):
By the eigenvalue equation we have: $$L u_{0} = \lambda_{0} u_{0}.$$ Since $\lambda_{0}=0$, then: $$L u_{0} = 0.$$ Multiplying by $u_{0}^{\top}$ we get: $$u_{0}^{\top} L u_{0} = 0,$$ but the quadratic form of the combinatorial Laplacian is given by: $$u_{0}^{\top} L u_{0} = \sum_{(i, j) \in E} w_{i,j}(u_{0}[i] - u_{0}[j])^2,$$ hence: $$\sum_{(i, j) \in E} w_{i,j}(u_{0}[i] - u_{0}[j])^2 = 0.$$ For this to hold, $u_{0}[i] = u_{0}[j]$ for every edge $(i, j) \in E$. Since the graph is connected, this forces $$u_{0} = c \begin{bmatrix} 1 \\ 1 \\ \vdots \\ 1 \end{bmatrix}.$$ Therefore, $u_{0}$ is the constant vector $e$, normalized to unit norm.
For the normalized Laplacian, if we follow the same argument as before, we have that $$(u_{0}')^{\top} L_{n} u_{0}' = 0.$$
Since $$L_n = D^{-\frac{1}{2}} L D^{-\frac{1}{2}},$$
we get: $$(u_{0}')^{\top} D^{-\frac{1}{2}} L D^{-\frac{1}{2}} u_{0}' = 0.$$
As shown above, this yields that $D^{-\frac{1}{2}} u_{0}'$ is the constant vector $e$.
Therefore, $u_{0}' = D^{\frac{1}{2}} e$ (up to normalization).
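We can confirm this numerically (a small sketch using the variables defined above, assuming the graph is connected; the sign of an eigenvector is arbitrary, hence the absolute values):

# Numerical check: u0 of L_comb is constant, u0 of L_norm is proportional to D^(1/2) 1
d = adjacency.sum(axis=1)
u0 = U_comb[:, 0]
print(np.allclose(u0, u0.mean(), atol=1e-6))
ref = np.sqrt(d) / np.linalg.norm(np.sqrt(d))
print(np.allclose(np.abs(U_norm[:, 0]), ref, atol=1e-6))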
When filtering a signal or computing polynomials, which Laplacian provides the best numerical stability? Justify your answer.
The normalized one. The eigenvalues of the normalized Laplacian are bounded between 0 and 2, while the eigenvalues of the combinatorial Laplacian are unbounded (they grow with the maximum degree of the graph), which can make computing polynomials of the Laplacian numerically unstable.
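This is easy to check on our two-moons graph (a sketch using the eigenvalues computed above):

# The normalized spectrum is bounded by 2; the combinatorial one is not
print('combinatorial: lambda_max = {:.2f}'.format(lamb_comb.max()))
print('normalized:    lambda_max = {:.2f}'.format(lamb_norm.max()))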
The eigendecomposition provides an easy way to compute the number of connected components in the graph. Fill the following function:
def compute_number_connected_components(lamb: np.array, threshold: float):
    """ lamb: array of eigenvalues of a Laplacian
        Return:
        n_components (int): number of connected components.
    """
    # the multiplicity of the zero eigenvalue equals the number of components
    return np.count_nonzero(lamb <= threshold)
Tune the parameters $\epsilon$ and $\sigma$ of the similarity graph so that the graph is connected. Otherwise, clustering would be too simple!
print(compute_number_connected_components(lamb_norm, threshold=1e-12))
1
from sklearn.cluster import KMeans
kmeans = KMeans(n_clusters=2)
y_pred = kmeans.fit_predict(X)
plt.scatter(X[:, 0], X[:, 1], c=y_pred)
plt.show()
K-means cannot find a good solution to this problem. Why?
K-means expects clusters that are convex and isotropic (i.e. roughly ball-shaped) and therefore performs poorly with the elongated shapes present in the dataset.
As opposed to naive K-means, spectral clustering doesn't operate on the input space but on the eigenspace of the graph that represents the data. Implement spectral clustering. You can use this tutorial.
class SpectralClustering():
    def __init__(self, n_classes: int, normalize: bool):
        self.n_classes = n_classes
        self.normalize = normalize
        self.laplacian = None
        self.e = None
        self.U = None
        self.clustering_method = KMeans(n_classes)

    def fit_predict(self, adjacency):
        """ Your code should be correct both for the combinatorial
        and the symmetric normalized spectral clustering.
        Return:
        y_pred (np.ndarray): cluster assignments.
        """
        self.laplacian = compute_laplacian(adjacency, self.normalize)
        self.e, self.U = spectral_decomposition(self.laplacian)
        # sanity check: the graph should be connected
        n_connected = compute_number_connected_components(self.e, threshold=1e-12)
        # embed each node using the first n_classes eigenvectors
        first_columns = self.U[:, :self.n_classes]
        if self.normalize:
            # row-normalize the embedding (Ng-Jordan-Weiss)
            first_columns = first_columns / np.linalg.norm(first_columns, axis=1)[:, None]
        y_pred = self.clustering_method.fit_predict(first_columns)
        return y_pred
print("Connected components:", compute_number_connected_components(lamb_norm, threshold=1e-12))
spectral_clustering = SpectralClustering(n_classes=2, normalize=True)
y_pred = spectral_clustering.fit_predict(adjacency)
plt.scatter(X[:, 0], X[:, 1], c=y_pred)
plt.show()
Connected components: 1
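Since the two-moons labels `Y` are known, we can also quantify the result (a sketch; cluster indices are arbitrary, so we test both assignments):

# Clustering accuracy up to label permutation
accuracy = max(np.mean(y_pred == Y), np.mean(y_pred == 1 - Y))
print('Spectral clustering accuracy: {:.3f}'.format(accuracy))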
Can you think of another 2D dataset on which k-means would perform badly, but spectral clustering would not?
Construct it!
For this question you can import any dataset of your choice, for example from `sklearn.datasets` or `pygsp.graphs`, but you can also get creative and define something of your own. First, create and plot the dataset.
# borrowed from team #05 submission
def Smiley(N=2000):
    """ Return a 2D dataset representing a smiley with 4 classes (head, 2 eyes, mouth). """
    # Head: points on an annulus of radius in [1.75, 2]
    length = np.random.uniform(1.75, 2, size=int(2*N/5))
    angle = np.pi * np.random.uniform(0, 2, size=int(2*N/5))
    X_head = np.stack([length * np.cos(angle), length * np.sin(angle)], axis=1)
    Y_head = np.ones(X_head.shape[0])
    # Eye 1: small annulus centered at (-0.75, 0.75)
    length = np.random.uniform(0.1, 0.4, size=int(N/5))
    angle = np.pi * np.random.uniform(0, 2, size=int(N/5))
    X_eye1 = np.stack([-0.75 + length * np.cos(angle), 0.75 + length * np.sin(angle)], axis=1)
    Y_eye1 = 2 * np.ones(X_eye1.shape[0])
    # Eye 2: small annulus centered at (0.75, 0.75)
    length = np.random.uniform(0.1, 0.4, size=int(N/5))
    angle = np.pi * np.random.uniform(0, 2, size=int(N/5))
    X_eye2 = np.stack([0.75 + length * np.cos(angle), 0.75 + length * np.sin(angle)], axis=1)
    Y_eye2 = 3 * np.ones(X_eye2.shape[0])
    # Mouth: arc below the center
    length = np.random.uniform(1, 1.25, size=int(N/5))
    angle = np.pi * np.random.uniform(-0.85, -0.15, size=int(N/5))
    X_mouth = np.stack([length * np.cos(angle), -0.2 + length * np.sin(angle)], axis=1)
    Y_mouth = 4 * np.ones(X_mouth.shape[0])
    X = np.concatenate([X_head, X_eye1, X_eye2, X_mouth], axis=0)
    Y = np.concatenate([Y_head, Y_eye1, Y_eye2, Y_mouth], axis=0)
    return X, Y
X_s, Y_s = Smiley(1000)
fig, ax = plt.subplots(1,1,figsize=(5,5))
ax.scatter(X_s[:, 0], X_s[:, 1], c=Y_s, cmap='coolwarm')
ax.set_title('Smiley dataset (4 clusters)')
plt.show()
Run K-means:
kmeans = KMeans(n_clusters=4)
y_pred_s = kmeans.fit_predict(X_s)
fig, ax = plt.subplots(1,1,figsize=(5,5))
plt.scatter(X_s[:, 0], X_s[:, 1], c=y_pred_s, cmap='coolwarm')
plt.show()
Create the similarity graph, and run spectral clustering with both the combinatorial and normalized Laplacian matrices:
adjacency_s = epsilon_similarity_graph(X_s, sigma=0.3, epsilon=0.5)
lamb_norm_s, _ = spectral_decomposition(compute_laplacian(adjacency_s, normalize=True))
print("Connected components:", compute_number_connected_components(lamb_norm_s, threshold=1e-12))
# normalized
spectral_clustering_n = SpectralClustering(n_classes=4, normalize=True)
y_pred_s_norm = spectral_clustering_n.fit_predict(adjacency_s)
# non-normalized
spectral_clustering = SpectralClustering(n_classes=4, normalize=False)
y_pred_s = spectral_clustering.fit_predict(adjacency_s)
fig, ax = plt.subplots(1,2,figsize=(12,5))
ax[0].scatter(X_s[:, 0], X_s[:, 1], c=y_pred_s, cmap='coolwarm')
ax[0].set_title('Spectral clustering - non normalized')
ax[1].scatter(X_s[:, 0], X_s[:, 1], c=y_pred_s_norm, cmap='coolwarm')
ax[1].set_title('Spectral clustering - normalized')
plt.show()
Connected components: 1
K-means also performs poorly on this dataset because of the non-convex/non-spherical clusters it contains. Spectral clustering works well regardless of normalization, provided the graph is constructed with the appropriate parameters $\sigma$ and $\epsilon$. The choice of those parameters is (again) crucial for spectral clustering to work as expected.
Most datasets are very high-dimensional, which means it can be very hard to understand their geometry. Fortunately, there exist multiple techniques that can help us reduce the dimensionality of the data and allow us to visualize it.
In this part of the assignment we will use MNIST to compare these techniques. Indeed, without dimensionality reduction it would be very difficult to answer questions like: are the different digits clustered together in different areas of space?
But first, let's load our dataset:
from utils import load_mnist
X_mnist, y_mnist = load_mnist()
classes = np.unique(y_mnist)
Most dimensionality reduction algorithms are constructed such that some property of the dataset remains invariant in the lower-dimensional representation. Before implementing Laplacian eigenmaps, can you say what property of the data this algorithm preserves?
Solution:
Laplacian eigenmaps make the assumption that the observations lie on a low-dimensional, possibly nonlinear manifold. They aim at preserving the proximity of points on the manifold.
Implement a function that uses Laplacian eigenmaps to do dimensionality reduction. Solution (from team 3):
def laplacian_eigenmaps(X: np.ndarray, dim: int, sigma: float, epsilon: float, normalize: bool):
    """ Return:
        coords (n x dim array): new coordinates for the data points."""
    adjacency = epsilon_similarity_graph(X, sigma, epsilon)
    laplacian = compute_laplacian(adjacency, normalize)
    lamb, U = spectral_decomposition(laplacian)
    # number of connected components = number of zero eigenvalues;
    # zero eigenvalues are associated with (piecewise) constant vectors
    n_CC = compute_number_connected_components(lamb, threshold=1e-12)
    # only take the dim columns associated with the smallest non-zero eigenvalues
    coords = U[:, n_CC:n_CC + dim]
    return coords
Use this function to visualize MNIST in 2D. Feel free to play with the different parameters.
dim = 2
sigma = 2e3
epsilon = 0
normalize = True
X_2d = laplacian_eigenmaps(X_mnist, dim, sigma, epsilon, normalize)
for i in classes:
    mask = y_mnist == i
    plt.scatter(X_2d[mask, 0], X_2d[mask, 1], label=i)
plt.legend()
plt.title("Visualization of MNIST in 2d using Laplacian eigenmaps")
plt.show()
Visualize MNIST in 3D:
dim = 3
sigma = 2e3
epsilon = 0
normalize = True
X_3d = laplacian_eigenmaps(X_mnist, dim, sigma, epsilon, normalize)
fig = plt.figure()
ax = Axes3D(fig)
for i in classes:
    mask = y_mnist == i
    ax.scatter(X_3d[mask, 0], X_3d[mask, 1], X_3d[mask, 2], label=i)
plt.legend()
plt.title("Visualization of MNIST in 3d using Laplacian eigenmaps")
plt.show()
We provide the visualization of MNIST with other methods:
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE, Isomap
# This cell can take a few minutes to run
run_this_cell = True
if run_this_cell:
    # In 2d
    embeddings = [PCA(n_components=2, copy=True, whiten=True, tol=1e-5),
                  Isomap(n_components=2, n_neighbors=5),
                  TSNE(n_components=2)]
    for embedding in embeddings:
        X_embedded = embedding.fit_transform(X_mnist)
        fig = plt.figure()
        for i in classes:
            mask = y_mnist == i
            plt.scatter(X_embedded[mask, 0], X_embedded[mask, 1], label=i)
        plt.legend()
        plt.title('Embedding method: ' + type(embedding).__name__)
        plt.show()
    # In 3d
    embeddings = [PCA(n_components=3, copy=True, whiten=True, tol=1e-5),
                  Isomap(n_components=3, n_neighbors=5),
                  TSNE(n_components=3)]
    for embedding in embeddings:
        X_embedded = embedding.fit_transform(X_mnist)
        fig = plt.figure()
        ax = Axes3D(fig)
        for i in classes:
            mask = y_mnist == i
            ax.scatter(X_embedded[mask, 0], X_embedded[mask, 1], X_embedded[mask, 2], label=i)
        ax.legend()
        ax.title.set_text('Embedding method: ' + type(embedding).__name__)
        plt.show()
In a few words, what are the principles guiding the design of each method? Compare their results.
Solution (from team 3):
PCA is a linear method that projects the data onto the directions associated with the $k$ largest singular values of the data matrix. These axes span the best linear subspace of dimension $k$, in the sense that the variance of the orthogonal projection of the data points onto it is maximal. It is the method of choice if the data can be summarized as linear combinations of features.
Isomap is a non-linear method that starts by converting the data matrix to a graph; the matrix of shortest-path distances is then computed, and the linear Multi-Dimensional Scaling method is applied to it. The advantages of Isomap are that it is able to discover manifolds of arbitrary dimensionality and that it is guaranteed to converge to the globally optimal solution. We can see that it produces a better separation of the classes than PCA on the MNIST dataset.
t-SNE is yet another non-linear method, which tries to circumvent the crowding problem, i.e., when a lot of data points are constrained to a small part of space. It does this by minimizing the divergence between a distribution of pairwise similarities of the original points (expressed as conditional probabilities) and a similar distribution over the embedded points in $\mathbb{R}^k$. It usually produces better visualizations than the other methods thanks to its crowding-circumvention property, and we can see that this is the case here.
In this part of the assignment we are going to familiarize ourselves with the main concepts in Graph Signal Processing and regularization on graphs in general. From now on, you can only use the following libraries as well as the functions that you implemented in the previous parts.
import pandas as pd
import numpy as np
from pygsp.graphs import Bunny
In this exercise we will use a nearest-neighbor graph constructed from the Stanford Bunny point cloud included in the PyGSP library.
G = Bunny()
adjacency = np.asarray(G.W.todense())
n_nodes = adjacency.shape[0]
We will use the following function to plot our signals on this graph.
def plot_bunny(x=None, title='', vlim=[-0.03, 0.03]):
    fig = plt.gcf()
    ax = plt.gca()
    if not isinstance(ax, Axes3D):
        ax = plt.subplot(111, projection='3d')
    if x is not None:
        x = np.squeeze(x)
    p = ax.scatter(G.coords[:, 0], G.coords[:, 1], G.coords[:, 2], c=x, marker='o',
                   s=5, cmap='RdBu_r', vmin=vlim[0], vmax=vlim[1])
    ax.view_init(elev=-90, azim=90)
    ax.dist = 7
    ax.set_axis_off()
    ax.set_title(title)
    if x is not None:
        fig.colorbar(p)

plt.subplot(111, projection='3d')
plot_bunny()
Let us start by constructing the normalized graph Laplacian from the adjacency matrix and finding its spectral decomposition.
laplacian = compute_laplacian(adjacency, normalize=True)
lam, U = spectral_decomposition(laplacian)
Plot the eigenvalues.
plt.figure(figsize=(6, 5))
plt.plot(lam)
plt.title('Eigenvalues $L_{norm}$')
plt.show()
To make things clearer we will plot some of its eigenvectors (0, 1, 2, 3, 10, 100) as signals on the bunny graph.
plt.figure(figsize=(18, 9))
plt.subplot(231, projection='3d')
plot_bunny(x=U[:,0], title='Eigenvector #0')
plt.subplot(232, projection='3d')
plot_bunny(x=U[:,1], title='Eigenvector #1')
plt.subplot(233, projection='3d')
plot_bunny(x=U[:,2], title='Eigenvector #2')
plt.subplot(234, projection='3d')
plot_bunny(x=U[:,3], title='Eigenvector #3')
plt.subplot(235, projection='3d')
plot_bunny(x=U[:,10], title='Eigenvector #10')
plt.subplot(236, projection='3d')
plot_bunny(x=U[:,100], title='Eigenvector #100')
What can you say in terms of the variation (smoothness) of these signals? How can the smoothness of a signal be measured?
Solution: These signals become less and less smooth as the corresponding eigenvalue increases. In general, the variation (inverse smoothness) of a signal on a graph can be measured by the quadratic form of the Laplacian $x^\top L x = \sum_{(i, j) \in \mathcal{E}} w_{i, j} (x_i - x_j)^2$. This quantity can also be seen as the squared norm of the graph gradient.
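In particular, for a unit-norm eigenvector $u_k$ the quadratic form equals $\lambda_k$, which we can verify for the eigenvectors plotted above (a small sketch using the variables defined below for the bunny graph):

# The smoothness measure of an eigenvector is its eigenvalue: u_k^T L u_k = lambda_k
for k in [0, 1, 2, 3, 10, 100]:
    print('k={:3d}: quadratic form={:.4f}, lambda={:.4f}'.format(
        k, U[:, k] @ laplacian @ U[:, k], lam[k]))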
Create a function to compute the Graph Fourier Transform (GFT) of a graph signal and its inverse.
Note: You can assume that you have internal access to the eigendecomposition (`U` and `lam`) of the laplacian.
def GFT(signal: np.ndarray):
    # project the signal onto the Laplacian eigenbasis
    return U.T @ signal

def iGFT(fourier_coefficients: np.ndarray):
    # reconstruct the signal from its Fourier coefficients
    return U @ fourier_coefficients
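Since `U` is orthonormal, the GFT is an isometry (Parseval) and `iGFT` inverts it exactly; a quick sketch to check this on a random signal:

# Check: the GFT preserves energy and iGFT(GFT(s)) recovers s
s = np.random.randn(n_nodes)
print(np.isclose(np.linalg.norm(s), np.linalg.norm(GFT(s))))
print(np.allclose(iGFT(GFT(s)), s))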
Now, let's create a graph signal:
x = G.coords[:, 0] + G.coords[:, 1] + 3 * G.coords[:, 2]
x /= np.linalg.norm(x)
noise = np.random.randn(n_nodes)
noise /= np.linalg.norm(noise)
x_noisy = x + 0.3*noise
plot_bunny(x_noisy, vlim=[min(x_noisy), max(x_noisy)])
and plot its graph spectrum:
plt.figure(figsize=(10, 6))
plt.plot(lam, np.abs(GFT(x_noisy)), 'r.')
plt.plot(lam, np.abs(GFT(x)), 'g-')
plt.xlabel('$\lambda$')
plt.ylabel('GFT')
plt.legend(['$x_{noisy}$', '$x$'])
We will try to extract the signal from the noise using graph filters. Let us start by creating three ideal graph filters.
ideal_lp = np.ones((n_nodes,))
ideal_bp = np.ones((n_nodes,))
ideal_hp = np.ones((n_nodes,))
ideal_lp[lam >= 0.1] = 0 # Low-pass filter with cut-off at lambda=0.1
ideal_bp[lam < 0.1] = 0 # Band-pass filter with cut-offs at lambda=0.1 and lambda=0.5
ideal_bp[lam > 0.5] = 0
ideal_hp[lam <= 1] = 0 # High-pass filter with cut-off at lambda=1
Additionally, create the ideal graph filter that implements the solution of Tikhonov regularization.
# alpha chosen such that alpha * lam_max < 1 (also needed later for ARMA convergence)
alpha = 0.99 / np.max(lam)
ideal_tk = 1 / (1 + alpha * lam)
Let's plot the spectral responses:
plt.plot(lam, ideal_lp, '-', label='LP')
plt.plot(lam, ideal_bp, '-', label='BP')
plt.plot(lam, ideal_hp, '-', label='HP')
plt.plot(lam, ideal_tk, '-', label='Tikhonov')
plt.xlabel('$\lambda$')
plt.ylabel('Spectral response')
plt.legend(loc='lower right')
Create a function to filter a signal given an ideal graph filter.
def ideal_graph_filter(x: np.ndarray, spectral_response: np.ndarray):
    """Return a filtered signal."""
    x_gft = GFT(x)  # go to the spectral domain
    filtered_gft = x_gft * spectral_response  # scale each frequency component
    return iGFT(filtered_gft)  # back to the vertex domain
Let us visualize the results:
x_lp = ideal_graph_filter(x_noisy, ideal_lp)
x_bp = ideal_graph_filter(x_noisy, ideal_bp)
x_hp = ideal_graph_filter(x_noisy, ideal_hp)
x_tk = ideal_graph_filter(x_noisy, ideal_tk)
plt.figure(figsize=(18, 9))
plt.subplot(231, projection='3d')
plot_bunny(x=x, title='signal (true)', vlim=[min(x), max(x)])
plt.subplot(232, projection='3d')
plot_bunny(x=x_noisy, title='signal (noisy)', vlim=[min(x), max(x)])
plt.subplot(233, projection='3d')
plot_bunny(x=x_lp, title='Low-pass', vlim=[min(x_lp), max(x_lp)])
plt.subplot(234, projection='3d')
plot_bunny(x=x_bp, title='Band-pass', vlim=[min(x_bp), max(x_bp)])
plt.subplot(235, projection='3d')
plot_bunny(x=x_hp, title='High-pass', vlim=[min(x_hp), max(x_hp)])
plt.subplot(236, projection='3d')
plot_bunny(x=x_tk, title='Tikhonov denoised signal', vlim=[min(x_tk), max(x_tk)])
How would you link this to the observations you made before about the spectral decomposition of the laplacian? Also, judging from the results, what type of model prior do you think Tikhonov regularization enforces?
Solution: Graph filtering is an operation that scales the coordinates of a graph signal in the basis given by the spectral decomposition of the laplacian. In this sense, a low-pass filter only preserves the components associated with the smallest eigenvalues (and hence it smooths the signal), a high-pass filter preserves the components associated with the largest eigenvalues (and hence it produces signals with rapid spatial variations), and a band-pass filter preserves the components in between (and produces a mildly smooth signal).
Looking at the spectral response of the Tikhonov filter, we see that it weights down the components associated with large eigenvalues and preserves the low frequencies. We thus say that it is a low-pass filter.
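We can also quantify the denoising effect (a sketch comparing relative errors, using the variables defined above):

# Relative reconstruction error before and after Tikhonov filtering
print('noisy:    {:.3f}'.format(np.linalg.norm(x_noisy - x) / np.linalg.norm(x)))
print('tikhonov: {:.3f}'.format(np.linalg.norm(x_tk - x) / np.linalg.norm(x)))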
We have seen how we can use the GFT to define different filters that enhance or reduce certain frequency bands. However, to do so, we require an explicit eigendecomposition of the graph laplacian, which has a cost of $O(n^3)$. For very large graphs this is computationally very intensive. We will now see how we can obtain similar results by filtering the signals directly in the vertex domain, without resorting to an eigendecomposition.
The key idea is to use a polynomial of the graph laplacian to define a graph filter, i.e., $g(L)x=\sum_{k=0}^K \alpha_k L^k x$, and use the fact that the powers of a diagonalizable matrix can be written in terms of powers of its eigenvalues. That is, $$ L^k=(U\Lambda U^\top)^k=U\Lambda^k U^\top = U\begin{bmatrix} \lambda_1^k &\dots & 0\\ \vdots & \ddots & \vdots\\ 0 & \dots & \lambda_N^k \end{bmatrix} U^\top. $$
This means that a polynomial of the graph laplacian acts independently on each eigenvalue of the graph, and has a frequency response of $$g(\lambda)=\sum_{k=0}^K \alpha_k \lambda^k.$$ Hence, $$g(L)x=\sum_{k=0}^K \alpha_k L^k x=\sum_{k=0}^K \alpha_k U\Lambda^k U^\top x=U \left(\sum_{k=0}^K \alpha_k\Lambda^k \right)U^\top x=\operatorname{iGFT}\left(g(\Lambda)\operatorname{GFT}(x)\right).$$
With these ingredients, we have reduced the design of graph filters in the vertex domain to a regression task that approximates a given spectral response by a polynomial. There are multiple ways to do this, but in this assignment we will implement a very simple strategy based on least-squares regression.
Implement a function to find the coefficients of a polynomial that approximates a given ideal filter.
Hint: `np.vander` and `np.linalg.lstsq`.
def fit_polynomial(lam: np.ndarray, order: int, spectral_response: np.ndarray):
    """ Return an array of polynomial coefficients of length 'order'."""
    # Vandermonde matrix: A[i, k] = lam[i]**k for k = 0, ..., order - 1
    A = np.vander(lam, order, increasing=True)
    coeff = np.linalg.lstsq(A, spectral_response, rcond=None)[0]
    return coeff
Implement a function to compute the frequency response of that filter.
def polynomial_graph_filter_response(coeff: np.array, lam: np.ndarray):
    """ Return an array of the same shape as lam.
        response[i] is the spectral response at frequency lam[i]. """
    response = np.zeros_like(lam)
    for n, c in enumerate(coeff):
        response += c * lam**n
    return response
Let us fit the Tikhonov ideal filter with several polynomials of different order.
plt.plot(lam, ideal_tk)
orders = [1, 2, 3, 5, 10, 20]
for order in orders:
    coeff_tk = fit_polynomial(lam, order, ideal_tk)
    plt.plot(lam, polynomial_graph_filter_response(coeff_tk, lam))
plt.xlabel('$\lambda$')
plt.ylabel('Spectral response')
plt.legend(orders)
So far, we have only defined a way to compute the coefficients of our laplacian polynomial. Let us now compute our graph filter.
def polynomial_graph_filter(coeff: np.array, laplacian: np.ndarray):
    """ Return the laplacian polynomial with coefficients 'coeff'. """
    power = np.eye(laplacian.shape[0])  # L^0 = I
    filt = coeff[0] * power
    for c in coeff[1:]:
        power = laplacian @ power  # next power of L
        filt += c * power
    return filt
Based on the previous plot, choose a filter order that achieves (in your opinion) a good tradeoff in terms of computational complexity and response accuracy.
order = 3
coeff_tk = fit_polynomial(lam, order, ideal_tk)
g_tk = polynomial_graph_filter(coeff_tk, laplacian)
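As a consistency check (a sketch, not required by the assignment), the filter matrix built in the vertex domain should match the spectral construction $U g(\Lambda) U^\top$:

# g(L) assembled from powers of L should equal U diag(g(lambda)) U^T
g_spec = U @ np.diag(polynomial_graph_filter_response(coeff_tk, lam)) @ U.T
print(np.allclose(g_tk, g_spec, atol=1e-6))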
As you have seen in class, polynomial graph filters are only one of the ways in which you can approximate ideal graph filters. In this sense, ARMA filters are a natural way to implement Tikhonov denoising on graphs. Let us recall the general solution of the Tikhonov regularized denoising problem
$$y=(I+\alpha L)^{-1}x.$$ With a little bit of algebraic manipulation we can rewrite this expression as $$ y = -\alpha L y + x, $$ from which we can derive the iterative algorithm $$ y_k = -\alpha L y_{k-1} + x, \qquad k=1,2,\dots $$ Unrolling the recursion (with $y_0 = x$) gives the truncated Neumann series $y_k = \sum_{j=0}^{k} (-\alpha L)^j x$, which is guaranteed to converge to $(I+\alpha L)^{-1}x$ as long as $\alpha \lambda_{max} < 1$.
Implement the ARMA version of Tikhonov regularization.
def arma_tikhonov(x: np.ndarray, laplacian: np.ndarray, alpha: float, max_iter=50):
    """ Return an array of the same shape as x."""
    y = x
    for _ in range(max_iter):
        # fixed-point iteration: converges when alpha * lam_max < 1
        y = -alpha * laplacian @ y + x
    return y
Filter the previous noisy graph signal with the polynomial and ARMA approximations of the ideal Tikhonov filter.
x_tk_polynomial = g_tk @ x_noisy
x_tk_arma = arma_tikhonov(x_noisy, laplacian, alpha)
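To see how close the two approximations are to the exact solution, we can compare them against the closed-form $(I+\alpha L)^{-1} x_{noisy}$ (a sketch; note that with $\alpha \lambda_{max} = 0.99$ the ARMA iteration converges rather slowly):

# Exact Tikhonov solution via a linear solve, for reference
y_exact = np.linalg.solve(np.eye(n_nodes) + alpha * laplacian, x_noisy)
print('polynomial: {:.3e}'.format(np.linalg.norm(x_tk_polynomial - y_exact)))
print('arma:       {:.3e}'.format(np.linalg.norm(x_tk_arma - y_exact)))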
Let us compare with the previous version.
plt.figure(figsize=(18, 4))
plt.subplot(131, projection='3d')
plot_bunny(x_tk, title='Ideal filter', vlim=[min(x_tk), max(x_tk)])
plt.subplot(132, projection='3d')
plot_bunny(x_tk_polynomial, title='Polynomial filter', vlim=[min(x_tk), max(x_tk)])
plt.subplot(133, projection='3d')
plot_bunny(x_tk_arma, title='ARMA filter', vlim=[min(x_tk), max(x_tk)])
So far, we have only played with toy examples. Let us see the use of these tools in practice! In particular, let us see how we can use some graph filters to construct features to feed a classifier. For this part of the assignment we will import some extra packages.
import time
import networkx as nx
from sklearn.linear_model import LogisticRegression
import torch
import torch.nn as nn
import torch.nn.functional as F
import dgl.function as fn
from dgl import DGLGraph
from dgl.data.citation_graph import load_cora
np.random.seed(0)
torch.manual_seed(1)
We will use the CORA dataset and the citation graph that we created in Assignment 1. However, to simplify the next tasks we will directly use the preprocessed version of this dataset contained within the Deep Graph Library (DGL).
In this assignment, we will interpret CORA's features as multidimensional graph signals living on the citation graph. Our task is to design a classifier that uses these features and the geometry of the graph to identify the type of paper each node represents.
The goal of this exercise is to do semi-supervised learning on graphs.
We assume that we know to which scientific field a small subset of the papers belongs (the ones contained in `train_mask`).
The goal is to predict to which field the other papers belong, using both the citation graph and the bag-of-words representation of each paper.
cora = load_cora()
features = torch.FloatTensor(cora.features) # Feature vector for each paper
labels = torch.LongTensor(cora.labels) # The field to which each paper belongs
train_mask = torch.BoolTensor(cora.train_mask) # Mask of nodes selected for training
val_mask = torch.BoolTensor(cora.val_mask) # Mask of nodes selected for validation
test_mask = torch.BoolTensor(cora.test_mask) # Mask of nodes selected for testing
in_feats = features.shape[1]
n_classes = cora.num_labels
n_edges = cora.graph.number_of_edges()
graph = cora.graph
adjacency = np.asarray(nx.to_numpy_matrix(graph))
n_nodes = adjacency.shape[0]
For this exercise we will use the normalized laplacian.
laplacian = compute_laplacian(adjacency, normalize=True)
lam, U = spectral_decomposition(laplacian)
lam_max = np.max(lam)
The simplest classification method consists in ignoring the citation graph and trying to classify the papers using only the features.
In this case, the problem is viewed as a standard classification task.
To train our classifier we will select a few nodes in our graph for training and fit a logistic regression classifier on them.
To avoid overfitting to the test set when we do hyperparameter tuning, we will also select a validation set.
And finally, we will test our classifier on the rest of the nodes.
Hint: use `sklearn.linear_model.LogisticRegression`.
train_features = features[train_mask]
train_labels = labels[train_mask]
val_features = features[val_mask]
val_labels = labels[val_mask]
test_features = features[test_mask]
test_labels = labels[test_mask]
log_reg = LogisticRegression(penalty='l2', multi_class="auto", solver="liblinear", C=1e4, fit_intercept=False, max_iter=1000)
log_reg.fit(train_features, train_labels)
train_acc = log_reg.score(train_features, train_labels)
val_acc = log_reg.score(val_features, val_labels)
test_acc = log_reg.score(test_features, test_labels)
print('Train accuracy {:.4f} | Validation accuracy {:.4f} | Test accuracy {:.4f}'.format(train_acc, val_acc, test_acc))
Train accuracy 1.0000 | Validation accuracy 0.5967 | Test accuracy 0.6050
That's not a bad start! Now, let's try to improve the results a bit by taking the graph structure into account using tools from GSP. For this purpose, we will design a handcrafted filter that will be used to denoise the signal before feeding it to a logistic regression.
However, before we start, what hypothesis can you make on the spectral properties of the denoised signal?
We can make the assumption that papers that are connected are similar, therefore making the associated signal smooth. The denoised signal here should then be made mostly of lower frequencies, and we will use a low-pass filter to create new features.
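One way to probe this hypothesis (a sketch, assuming the `features`, `U` and `lam` defined above) is to look at the average magnitude of the features' GFT coefficients as a function of frequency:

# Average spectrum of the CORA features: most energy should sit at low frequencies
feat_gft = U.T @ features.numpy()
plt.plot(lam, np.abs(feat_gft).mean(axis=1))
plt.xlabel('$\lambda$')
plt.ylabel('mean |GFT coefficient|')
plt.show()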
Based on this prior, design an ideal filter response that you believe could enhance important features of the graph.
Note: you just need to design one graph filter that we will apply to all features. Don't design a different filter for each feature.
Note: finding the right filter can be very challenging, don't worry if you can't find it. Just make sure you experiment with a few configurations and parameters.
alpha = 0.99 / lam_max
ideal_filter = 1 / (1 + alpha * lam)  # Tikhonov low-pass prior
Choose a filter order to approximate your filter using laplacian polynomials.
order = 5
coeff = fit_polynomial(lam, order, ideal_filter)
graph_filter = polynomial_graph_filter(coeff, laplacian)
Let's plot the frequency response of your spectral template and its polynomial approximation.
plt.plot(lam, ideal_filter)
plt.plot(lam, polynomial_graph_filter_response(coeff, lam))
plt.legend(['Ideal', 'Polynomial'])
plt.xlabel('$\lambda$')
plt.ylabel('Spectral response')
Now, let's create the new features.
filtered_features = graph_filter @ features.numpy()
train_features = filtered_features[train_mask,:]
train_labels = labels[train_mask]
val_features = filtered_features[val_mask,:]
val_labels = labels[val_mask]
test_features = filtered_features[test_mask,:]
test_labels = labels[test_mask]
Train another logistic regression classifier on the new features. Remember to play with the regularization parameters to achieve a well performing model.
log_reg = LogisticRegression(penalty='l2', multi_class="auto", solver="liblinear", C=1e4, fit_intercept=False, max_iter=1000)
log_reg.fit(train_features, train_labels)
Evaluate your model.
train_acc = log_reg.score(train_features, train_labels)
val_acc = log_reg.score(val_features, val_labels)
test_acc = log_reg.score(test_features, test_labels)
print('Train accuracy {:.4f} | Validation accuracy {:.4f} | Test accuracy {:.4f}'.format(train_acc, val_acc, test_acc))
Train accuracy 1.0000 | Validation accuracy 0.7167 | Test accuracy 0.7000
By now, you will probably have seen that it is challenging to find the right combination of spectral response, filter parameters and regularization method. And in most cases, this is a painstaking job. Wouldn't it be great to automate these tasks?
Fortunately, this is possible if we use the right tools! Specifically, we will see that Graph Convolutional Networks are a great framework to automate the feature extraction.
In this exercise, we will follow the same classification pipeline as above, but instead of hand-crafting our filter we will let `PyTorch` find the coefficients for us using gradient descent.
In this section, most of the code is already written. Try to understand it and to play with some parameters. It may be useful if you want to solve some learning task in your project.
We start by constructing a `LaplacianPolynomial` model in `DGL`. It computes the function $f(X) = \sum_{i=0}^{k} \alpha_i L^i X \theta$, where the trainable parameters are the coefficients $\alpha_i$ and the matrix $\theta$. This function can be interpreted as a filtering of $X$ by $\sum_{i=0}^{k} \alpha_i L^i$ followed by a linear layer.
class LaplacianPolynomial(nn.Module):
    def __init__(self,
                 in_feats: int,
                 out_feats: int,
                 k: int,
                 dropout_prob: float,
                 norm=True):
        super().__init__()
        self._in_feats = in_feats
        self._out_feats = out_feats
        self._k = k
        self._norm = norm
        # Contains the weights learned by the Laplacian polynomial
        self.pol_weights = nn.Parameter(torch.Tensor(self._k + 1))
        # Contains the weights learned by the logistic regression (without bias)
        self.logr_weights = nn.Parameter(torch.Tensor(in_feats, out_feats))
        self.dropout = nn.Dropout(p=dropout_prob)
        self.reset_parameters()

    def reset_parameters(self):
        """Reinitialize learnable parameters."""
        torch.manual_seed(0)
        torch.nn.init.xavier_uniform_(self.logr_weights, gain=0.01)
        torch.nn.init.normal_(self.pol_weights, mean=0.0, std=1e-3)

    def forward(self, graph, feat):
        r"""Compute graph convolution.

        Notes
        -----
        * Input shape: :math:`(N, *, \text{in_feats})` where * means any number of additional
          dimensions, :math:`N` is the number of nodes.
        * Output shape: :math:`(N, *, \text{out_feats})` where all but the last dimension are
          the same shape as the input.

        Parameters
        ----------
        graph (DGLGraph) : The graph.
        feat (torch.Tensor): The input feature

        Returns
        -------
        (torch.Tensor) The output feature
        """
        feat = self.dropout(feat)
        graph = graph.local_var()
        # D^(-1/2)
        norm = torch.pow(graph.in_degrees().float().clamp(min=1), -0.5)
        shp = norm.shape + (1,) * (feat.dim() - 1)
        norm = torch.reshape(norm, shp)
        # multiply by W first to reduce the feature size for aggregation
        feat = torch.matmul(feat, self.logr_weights)
        result = self.pol_weights[0] * feat.clone()
        for i in range(1, self._k + 1):
            old_feat = feat.clone()
            if self._norm:
                feat = feat * norm
            graph.ndata['h'] = feat
            # feat is not modified in place
            graph.update_all(fn.copy_src(src='h', out='m'),
                             fn.sum(msg='m', out='h'))
            if self._norm:
                graph.ndata['h'] = graph.ndata['h'] * norm
            # (I - D^(-1/2) A D^(-1/2)) feat: one more power of the normalized Laplacian
            feat = old_feat - graph.ndata['h']
            result += self.pol_weights[i] * feat
        return result

    def extra_repr(self):
        """Set the extra representation of the module,
        which will come into effect when printing the model.
        """
        summary = 'in={_in_feats}, out={_out_feats}'
        summary += ', normalization={_norm}'
        return summary.format(**self.__dict__)
Once we have our model ready, we just need to create a function that performs one step of our training loop, and another one that evaluates our model.
def train(model, g, features, labels, loss_fcn, train_mask, optimizer):
    model.train()  # activate dropout
    logits = model(g, features)
    loss = loss_fcn(logits[train_mask], labels[train_mask])
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss

def evaluate(model, g, features, labels, mask):
    model.eval()  # deactivate dropout
    with torch.no_grad():
        logits = model(g, features)[mask]  # only compute the evaluation set
        labels = labels[mask]
        _, indices = torch.max(logits, dim=1)
        correct = torch.sum(indices == labels)
        return correct.item() * 1.0 / len(labels)
Choose the training parameters.
pol_order = 3
lr = 0.2
weight_decay = 5e-6
n_epochs = 1000
p_dropout = 0.8
And train the classifier end to end.
graph = DGLGraph(cora.graph)
model = LaplacianPolynomial(in_feats, n_classes, pol_order, p_dropout)
loss_fcn = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(),
                             lr=lr,
                             weight_decay=weight_decay)
dur = []
for epoch in range(n_epochs):
    if epoch >= 3:
        t0 = time.time()
    loss = train(model, graph, features, labels, loss_fcn, train_mask, optimizer)
    if epoch >= 3:
        dur.append(time.time() - t0)
    acc = evaluate(model, graph, features, labels, val_mask)
    print("Epoch {:05d} | Time(s) {:.4f} | Train Loss {:.4f} | Val Accuracy {:.4f}".format(
        epoch, np.mean(dur), loss.item(), acc))

print()
acc = evaluate(model, graph, features, labels, test_mask)
print("Test Accuracy {:.4f}".format(acc))
Epoch 00000 | Time(s) nan | Train Loss 1.9459 | Val Accuracy 0.2667
Epoch 00001 | Time(s) nan | Train Loss 1.9221 | Val Accuracy 0.2533
Epoch 00002 | Time(s) nan | Train Loss 1.8533 | Val Accuracy 0.2467
...
Epoch 00348 | Time(s) 0.0338 | Train Loss 0.0716 | Val Accuracy 0.7600
Epoch 00349 | Time(s) 0.0338 | Train Loss 0.0396 | Val Accuracy 0.7600
(log truncated: over the first 350 of the 1000 epochs, the validation accuracy climbs from about 0.27 to about 0.78)
Time(s) 0.0338 | Train Loss 0.0397 | Val Accuracy 0.7567 Epoch 00351 | Time(s) 0.0338 | Train Loss 0.0282 | Val Accuracy 0.7567 Epoch 00352 | Time(s) 0.0338 | Train Loss 0.0781 | Val Accuracy 0.7533 Epoch 00353 | Time(s) 0.0338 | Train Loss 0.0456 | Val Accuracy 0.7533 Epoch 00354 | Time(s) 0.0338 | Train Loss 0.0419 | Val Accuracy 0.7567 Epoch 00355 | Time(s) 0.0338 | Train Loss 0.0921 | Val Accuracy 0.7533 Epoch 00356 | Time(s) 0.0338 | Train Loss 0.0557 | Val Accuracy 0.7533 Epoch 00357 | Time(s) 0.0338 | Train Loss 0.0482 | Val Accuracy 0.7567 Epoch 00358 | Time(s) 0.0338 | Train Loss 0.0400 | Val Accuracy 0.7567 Epoch 00359 | Time(s) 0.0338 | Train Loss 0.0522 | Val Accuracy 0.7567 Epoch 00360 | Time(s) 0.0338 | Train Loss 0.0915 | Val Accuracy 0.7633 Epoch 00361 | Time(s) 0.0338 | Train Loss 0.0809 | Val Accuracy 0.7667 Epoch 00362 | Time(s) 0.0337 | Train Loss 0.0685 | Val Accuracy 0.7700 Epoch 00363 | Time(s) 0.0337 | Train Loss 0.0412 | Val Accuracy 0.7733 Epoch 00364 | Time(s) 0.0337 | Train Loss 0.0600 | Val Accuracy 0.7800 Epoch 00365 | Time(s) 0.0337 | Train Loss 0.0816 | Val Accuracy 0.7767 Epoch 00366 | Time(s) 0.0337 | Train Loss 0.0722 | Val Accuracy 0.7733 Epoch 00367 | Time(s) 0.0337 | Train Loss 0.0978 | Val Accuracy 0.7733 Epoch 00368 | Time(s) 0.0336 | Train Loss 0.0363 | Val Accuracy 0.7733 Epoch 00369 | Time(s) 0.0336 | Train Loss 0.0350 | Val Accuracy 0.7733 Epoch 00370 | Time(s) 0.0337 | Train Loss 0.0577 | Val Accuracy 0.7667 Epoch 00371 | Time(s) 0.0337 | Train Loss 0.0490 | Val Accuracy 0.7633 Epoch 00372 | Time(s) 0.0337 | Train Loss 0.1220 | Val Accuracy 0.7667 Epoch 00373 | Time(s) 0.0337 | Train Loss 0.0696 | Val Accuracy 0.7533 Epoch 00374 | Time(s) 0.0337 | Train Loss 0.0723 | Val Accuracy 0.7533 Epoch 00375 | Time(s) 0.0338 | Train Loss 0.1220 | Val Accuracy 0.7567 Epoch 00376 | Time(s) 0.0338 | Train Loss 0.0374 | Val Accuracy 0.7567 Epoch 00377 | Time(s) 0.0338 | Train Loss 0.0702 | Val Accuracy 0.7533 Epoch 00378 | Time(s) 0.0338 | Train Loss 0.0928 | Val Accuracy 0.7533 Epoch 00379 | Time(s) 0.0338 | Train Loss 0.0523 | Val Accuracy 0.7500 Epoch 00380 | Time(s) 0.0339 | Train Loss 0.0710 | Val Accuracy 0.7500 Epoch 00381 | Time(s) 0.0339 | Train Loss 0.0704 | Val Accuracy 0.7500 Epoch 00382 | Time(s) 0.0339 | Train Loss 0.0577 | Val Accuracy 0.7500 Epoch 00383 | Time(s) 0.0339 | Train Loss 0.0707 | Val Accuracy 0.7633 Epoch 00384 | Time(s) 0.0340 | Train Loss 0.0580 | Val Accuracy 0.7667 Epoch 00385 | Time(s) 0.0340 | Train Loss 0.0677 | Val Accuracy 0.7700 Epoch 00386 | Time(s) 0.0340 | Train Loss 0.0571 | Val Accuracy 0.7700 Epoch 00387 | Time(s) 0.0340 | Train Loss 0.0343 | Val Accuracy 0.7767 Epoch 00388 | Time(s) 0.0340 | Train Loss 0.0491 | Val Accuracy 0.7800 Epoch 00389 | Time(s) 0.0340 | Train Loss 0.0302 | Val Accuracy 0.7733 Epoch 00390 | Time(s) 0.0340 | Train Loss 0.1066 | Val Accuracy 0.7800 Epoch 00391 | Time(s) 0.0340 | Train Loss 0.0370 | Val Accuracy 0.7833 Epoch 00392 | Time(s) 0.0339 | Train Loss 0.0258 | Val Accuracy 0.7800 Epoch 00393 | Time(s) 0.0339 | Train Loss 0.0304 | Val Accuracy 0.7800 Epoch 00394 | Time(s) 0.0339 | Train Loss 0.0883 | Val Accuracy 0.7767 Epoch 00395 | Time(s) 0.0339 | Train Loss 0.0449 | Val Accuracy 0.7733 Epoch 00396 | Time(s) 0.0339 | Train Loss 0.0405 | Val Accuracy 0.7733 Epoch 00397 | Time(s) 0.0338 | Train Loss 0.0417 | Val Accuracy 0.7733 Epoch 00398 | Time(s) 0.0338 | Train Loss 0.0853 | Val Accuracy 0.7733 Epoch 00399 | Time(s) 0.0338 | Train Loss 0.0634 | Val Accuracy 0.7733 Epoch 00400 | 
Time(s) 0.0338 | Train Loss 0.0567 | Val Accuracy 0.7733 Epoch 00401 | Time(s) 0.0338 | Train Loss 0.0451 | Val Accuracy 0.7733 Epoch 00402 | Time(s) 0.0337 | Train Loss 0.0627 | Val Accuracy 0.7700 Epoch 00403 | Time(s) 0.0337 | Train Loss 0.0781 | Val Accuracy 0.7767 Epoch 00404 | Time(s) 0.0337 | Train Loss 0.0794 | Val Accuracy 0.7733 Epoch 00405 | Time(s) 0.0337 | Train Loss 0.0382 | Val Accuracy 0.7733 Epoch 00406 | Time(s) 0.0337 | Train Loss 0.0408 | Val Accuracy 0.7733 Epoch 00407 | Time(s) 0.0337 | Train Loss 0.0422 | Val Accuracy 0.7733 Epoch 00408 | Time(s) 0.0336 | Train Loss 0.0603 | Val Accuracy 0.7733 Epoch 00409 | Time(s) 0.0336 | Train Loss 0.0489 | Val Accuracy 0.7767 Epoch 00410 | Time(s) 0.0336 | Train Loss 0.0255 | Val Accuracy 0.7733 Epoch 00411 | Time(s) 0.0336 | Train Loss 0.0266 | Val Accuracy 0.7733 Epoch 00412 | Time(s) 0.0336 | Train Loss 0.0429 | Val Accuracy 0.7733 Epoch 00413 | Time(s) 0.0336 | Train Loss 0.1291 | Val Accuracy 0.7700 Epoch 00414 | Time(s) 0.0336 | Train Loss 0.0266 | Val Accuracy 0.7633 Epoch 00415 | Time(s) 0.0336 | Train Loss 0.0582 | Val Accuracy 0.7600 Epoch 00416 | Time(s) 0.0335 | Train Loss 0.0096 | Val Accuracy 0.7633 Epoch 00417 | Time(s) 0.0335 | Train Loss 0.0227 | Val Accuracy 0.7600 Epoch 00418 | Time(s) 0.0335 | Train Loss 0.0324 | Val Accuracy 0.7600 Epoch 00419 | Time(s) 0.0335 | Train Loss 0.0796 | Val Accuracy 0.7600 Epoch 00420 | Time(s) 0.0335 | Train Loss 0.0434 | Val Accuracy 0.7567 Epoch 00421 | Time(s) 0.0335 | Train Loss 0.0274 | Val Accuracy 0.7600 Epoch 00422 | Time(s) 0.0335 | Train Loss 0.0459 | Val Accuracy 0.7633 Epoch 00423 | Time(s) 0.0335 | Train Loss 0.0315 | Val Accuracy 0.7633 Epoch 00424 | Time(s) 0.0335 | Train Loss 0.0309 | Val Accuracy 0.7633 Epoch 00425 | Time(s) 0.0335 | Train Loss 0.0399 | Val Accuracy 0.7633 Epoch 00426 | Time(s) 0.0335 | Train Loss 0.0855 | Val Accuracy 0.7700 Epoch 00427 | Time(s) 0.0334 | Train Loss 0.0463 | Val Accuracy 0.7733 Epoch 00428 | Time(s) 0.0334 | Train Loss 0.0285 | Val Accuracy 0.7733 Epoch 00429 | Time(s) 0.0334 | Train Loss 0.0452 | Val Accuracy 0.7767 Epoch 00430 | Time(s) 0.0334 | Train Loss 0.0338 | Val Accuracy 0.7800 Epoch 00431 | Time(s) 0.0334 | Train Loss 0.1237 | Val Accuracy 0.7767 Epoch 00432 | Time(s) 0.0334 | Train Loss 0.0688 | Val Accuracy 0.7800 Epoch 00433 | Time(s) 0.0334 | Train Loss 0.0292 | Val Accuracy 0.7833 Epoch 00434 | Time(s) 0.0334 | Train Loss 0.0913 | Val Accuracy 0.7900 Epoch 00435 | Time(s) 0.0334 | Train Loss 0.0336 | Val Accuracy 0.7900 Epoch 00436 | Time(s) 0.0333 | Train Loss 0.0393 | Val Accuracy 0.7967 Epoch 00437 | Time(s) 0.0333 | Train Loss 0.0595 | Val Accuracy 0.8000 Epoch 00438 | Time(s) 0.0333 | Train Loss 0.0331 | Val Accuracy 0.8033 Epoch 00439 | Time(s) 0.0333 | Train Loss 0.0742 | Val Accuracy 0.8033 Epoch 00440 | Time(s) 0.0333 | Train Loss 0.0435 | Val Accuracy 0.8033 Epoch 00441 | Time(s) 0.0334 | Train Loss 0.0136 | Val Accuracy 0.7967 Epoch 00442 | Time(s) 0.0335 | Train Loss 0.0597 | Val Accuracy 0.7867 Epoch 00443 | Time(s) 0.0335 | Train Loss 0.0464 | Val Accuracy 0.7867 Epoch 00444 | Time(s) 0.0335 | Train Loss 0.0227 | Val Accuracy 0.7833 Epoch 00445 | Time(s) 0.0335 | Train Loss 0.0521 | Val Accuracy 0.7767 Epoch 00446 | Time(s) 0.0335 | Train Loss 0.0325 | Val Accuracy 0.7800 Epoch 00447 | Time(s) 0.0335 | Train Loss 0.0876 | Val Accuracy 0.7767 Epoch 00448 | Time(s) 0.0335 | Train Loss 0.0608 | Val Accuracy 0.7767 Epoch 00449 | Time(s) 0.0335 | Train Loss 0.0738 | Val Accuracy 0.7767 Epoch 00450 | 
Time(s) 0.0335 | Train Loss 0.0419 | Val Accuracy 0.7767 Epoch 00451 | Time(s) 0.0335 | Train Loss 0.0605 | Val Accuracy 0.7767 Epoch 00452 | Time(s) 0.0335 | Train Loss 0.0589 | Val Accuracy 0.7767 Epoch 00453 | Time(s) 0.0335 | Train Loss 0.0240 | Val Accuracy 0.7800 Epoch 00454 | Time(s) 0.0335 | Train Loss 0.0280 | Val Accuracy 0.7767 Epoch 00455 | Time(s) 0.0334 | Train Loss 0.0855 | Val Accuracy 0.7800 Epoch 00456 | Time(s) 0.0334 | Train Loss 0.0893 | Val Accuracy 0.7767 Epoch 00457 | Time(s) 0.0334 | Train Loss 0.0710 | Val Accuracy 0.7700 Epoch 00458 | Time(s) 0.0334 | Train Loss 0.0473 | Val Accuracy 0.7700 Epoch 00459 | Time(s) 0.0334 | Train Loss 0.0330 | Val Accuracy 0.7667 Epoch 00460 | Time(s) 0.0335 | Train Loss 0.0229 | Val Accuracy 0.7667 Epoch 00461 | Time(s) 0.0335 | Train Loss 0.0770 | Val Accuracy 0.7633 Epoch 00462 | Time(s) 0.0336 | Train Loss 0.0744 | Val Accuracy 0.7633 Epoch 00463 | Time(s) 0.0336 | Train Loss 0.0994 | Val Accuracy 0.7667 Epoch 00464 | Time(s) 0.0336 | Train Loss 0.0204 | Val Accuracy 0.7700 Epoch 00465 | Time(s) 0.0336 | Train Loss 0.0381 | Val Accuracy 0.7767 Epoch 00466 | Time(s) 0.0336 | Train Loss 0.0450 | Val Accuracy 0.7733 Epoch 00467 | Time(s) 0.0336 | Train Loss 0.0421 | Val Accuracy 0.7767 Epoch 00468 | Time(s) 0.0336 | Train Loss 0.0372 | Val Accuracy 0.7667 Epoch 00469 | Time(s) 0.0336 | Train Loss 0.0545 | Val Accuracy 0.7667 Epoch 00470 | Time(s) 0.0336 | Train Loss 0.0559 | Val Accuracy 0.7700 Epoch 00471 | Time(s) 0.0336 | Train Loss 0.0704 | Val Accuracy 0.7667 Epoch 00472 | Time(s) 0.0336 | Train Loss 0.1044 | Val Accuracy 0.7733 Epoch 00473 | Time(s) 0.0336 | Train Loss 0.0616 | Val Accuracy 0.7767 Epoch 00474 | Time(s) 0.0336 | Train Loss 0.0263 | Val Accuracy 0.7767 Epoch 00475 | Time(s) 0.0336 | Train Loss 0.0242 | Val Accuracy 0.7733 Epoch 00476 | Time(s) 0.0336 | Train Loss 0.0159 | Val Accuracy 0.7733 Epoch 00477 | Time(s) 0.0336 | Train Loss 0.0958 | Val Accuracy 0.7767 Epoch 00478 | Time(s) 0.0336 | Train Loss 0.0501 | Val Accuracy 0.7733 Epoch 00479 | Time(s) 0.0336 | Train Loss 0.0856 | Val Accuracy 0.7700 Epoch 00480 | Time(s) 0.0336 | Train Loss 0.0278 | Val Accuracy 0.7733 Epoch 00481 | Time(s) 0.0336 | Train Loss 0.0358 | Val Accuracy 0.7700 Epoch 00482 | Time(s) 0.0336 | Train Loss 0.0268 | Val Accuracy 0.7733 Epoch 00483 | Time(s) 0.0336 | Train Loss 0.0680 | Val Accuracy 0.7733 Epoch 00484 | Time(s) 0.0336 | Train Loss 0.0861 | Val Accuracy 0.7700 Epoch 00485 | Time(s) 0.0336 | Train Loss 0.0654 | Val Accuracy 0.7733 Epoch 00486 | Time(s) 0.0336 | Train Loss 0.0529 | Val Accuracy 0.7767 Epoch 00487 | Time(s) 0.0336 | Train Loss 0.0508 | Val Accuracy 0.7767 Epoch 00488 | Time(s) 0.0336 | Train Loss 0.0361 | Val Accuracy 0.7733 Epoch 00489 | Time(s) 0.0336 | Train Loss 0.0627 | Val Accuracy 0.7733 Epoch 00490 | Time(s) 0.0336 | Train Loss 0.0225 | Val Accuracy 0.7733 Epoch 00491 | Time(s) 0.0336 | Train Loss 0.0431 | Val Accuracy 0.7700 Epoch 00492 | Time(s) 0.0337 | Train Loss 0.0213 | Val Accuracy 0.7667 Epoch 00493 | Time(s) 0.0337 | Train Loss 0.0211 | Val Accuracy 0.7633 Epoch 00494 | Time(s) 0.0338 | Train Loss 0.0507 | Val Accuracy 0.7633 Epoch 00495 | Time(s) 0.0338 | Train Loss 0.0869 | Val Accuracy 0.7633 Epoch 00496 | Time(s) 0.0339 | Train Loss 0.0601 | Val Accuracy 0.7667 Epoch 00497 | Time(s) 0.0339 | Train Loss 0.0509 | Val Accuracy 0.7700 Epoch 00498 | Time(s) 0.0339 | Train Loss 0.0534 | Val Accuracy 0.7667 Epoch 00499 | Time(s) 0.0339 | Train Loss 0.0317 | Val Accuracy 0.7667 Epoch 00500 | 
Time(s) 0.0339 | Train Loss 0.0462 | Val Accuracy 0.7667 Epoch 00501 | Time(s) 0.0339 | Train Loss 0.0431 | Val Accuracy 0.7633 Epoch 00502 | Time(s) 0.0339 | Train Loss 0.0214 | Val Accuracy 0.7633 Epoch 00503 | Time(s) 0.0339 | Train Loss 0.0472 | Val Accuracy 0.7633 Epoch 00504 | Time(s) 0.0339 | Train Loss 0.0716 | Val Accuracy 0.7733 Epoch 00505 | Time(s) 0.0339 | Train Loss 0.0277 | Val Accuracy 0.7800 Epoch 00506 | Time(s) 0.0339 | Train Loss 0.0351 | Val Accuracy 0.7833 Epoch 00507 | Time(s) 0.0339 | Train Loss 0.0133 | Val Accuracy 0.7867 Epoch 00508 | Time(s) 0.0339 | Train Loss 0.0355 | Val Accuracy 0.7900 Epoch 00509 | Time(s) 0.0339 | Train Loss 0.0362 | Val Accuracy 0.7900 Epoch 00510 | Time(s) 0.0339 | Train Loss 0.0599 | Val Accuracy 0.8000 Epoch 00511 | Time(s) 0.0339 | Train Loss 0.0764 | Val Accuracy 0.8000 Epoch 00512 | Time(s) 0.0339 | Train Loss 0.0389 | Val Accuracy 0.8000 Epoch 00513 | Time(s) 0.0340 | Train Loss 0.0137 | Val Accuracy 0.8033 Epoch 00514 | Time(s) 0.0340 | Train Loss 0.0226 | Val Accuracy 0.8000 Epoch 00515 | Time(s) 0.0340 | Train Loss 0.0396 | Val Accuracy 0.7967 Epoch 00516 | Time(s) 0.0340 | Train Loss 0.0218 | Val Accuracy 0.7933 Epoch 00517 | Time(s) 0.0340 | Train Loss 0.0312 | Val Accuracy 0.7933 Epoch 00518 | Time(s) 0.0340 | Train Loss 0.0333 | Val Accuracy 0.8000 Epoch 00519 | Time(s) 0.0340 | Train Loss 0.0575 | Val Accuracy 0.7933 Epoch 00520 | Time(s) 0.0340 | Train Loss 0.0314 | Val Accuracy 0.7933 Epoch 00521 | Time(s) 0.0340 | Train Loss 0.0488 | Val Accuracy 0.7867 Epoch 00522 | Time(s) 0.0341 | Train Loss 0.0547 | Val Accuracy 0.7867 Epoch 00523 | Time(s) 0.0341 | Train Loss 0.0817 | Val Accuracy 0.7767 Epoch 00524 | Time(s) 0.0341 | Train Loss 0.0291 | Val Accuracy 0.7733 Epoch 00525 | Time(s) 0.0341 | Train Loss 0.0181 | Val Accuracy 0.7800 Epoch 00526 | Time(s) 0.0341 | Train Loss 0.0298 | Val Accuracy 0.7800 Epoch 00527 | Time(s) 0.0341 | Train Loss 0.0572 | Val Accuracy 0.7767 Epoch 00528 | Time(s) 0.0340 | Train Loss 0.0508 | Val Accuracy 0.7767 Epoch 00529 | Time(s) 0.0340 | Train Loss 0.0233 | Val Accuracy 0.7733 Epoch 00530 | Time(s) 0.0340 | Train Loss 0.0189 | Val Accuracy 0.7733 Epoch 00531 | Time(s) 0.0340 | Train Loss 0.0465 | Val Accuracy 0.7733 Epoch 00532 | Time(s) 0.0340 | Train Loss 0.0376 | Val Accuracy 0.7733 Epoch 00533 | Time(s) 0.0340 | Train Loss 0.0378 | Val Accuracy 0.7700 Epoch 00534 | Time(s) 0.0340 | Train Loss 0.0338 | Val Accuracy 0.7700 Epoch 00535 | Time(s) 0.0340 | Train Loss 0.0167 | Val Accuracy 0.7667 Epoch 00536 | Time(s) 0.0340 | Train Loss 0.0265 | Val Accuracy 0.7700 Epoch 00537 | Time(s) 0.0340 | Train Loss 0.0350 | Val Accuracy 0.7633 Epoch 00538 | Time(s) 0.0340 | Train Loss 0.0407 | Val Accuracy 0.7633 Epoch 00539 | Time(s) 0.0340 | Train Loss 0.0125 | Val Accuracy 0.7667 Epoch 00540 | Time(s) 0.0340 | Train Loss 0.0733 | Val Accuracy 0.7633 Epoch 00541 | Time(s) 0.0340 | Train Loss 0.0337 | Val Accuracy 0.7667 Epoch 00542 | Time(s) 0.0339 | Train Loss 0.0192 | Val Accuracy 0.7667 Epoch 00543 | Time(s) 0.0339 | Train Loss 0.0341 | Val Accuracy 0.7667 Epoch 00544 | Time(s) 0.0339 | Train Loss 0.0805 | Val Accuracy 0.7700 Epoch 00545 | Time(s) 0.0340 | Train Loss 0.0346 | Val Accuracy 0.7733 Epoch 00546 | Time(s) 0.0340 | Train Loss 0.0261 | Val Accuracy 0.7800 Epoch 00547 | Time(s) 0.0340 | Train Loss 0.0261 | Val Accuracy 0.7800 Epoch 00548 | Time(s) 0.0340 | Train Loss 0.0353 | Val Accuracy 0.7900 Epoch 00549 | Time(s) 0.0340 | Train Loss 0.0210 | Val Accuracy 0.7900 Epoch 00550 | 
Time(s) 0.0340 | Train Loss 0.0360 | Val Accuracy 0.7933 Epoch 00551 | Time(s) 0.0340 | Train Loss 0.0476 | Val Accuracy 0.7967 Epoch 00552 | Time(s) 0.0339 | Train Loss 0.0283 | Val Accuracy 0.7900 Epoch 00553 | Time(s) 0.0339 | Train Loss 0.0456 | Val Accuracy 0.7933 Epoch 00554 | Time(s) 0.0339 | Train Loss 0.0232 | Val Accuracy 0.7933 Epoch 00555 | Time(s) 0.0339 | Train Loss 0.0397 | Val Accuracy 0.7833 Epoch 00556 | Time(s) 0.0339 | Train Loss 0.0317 | Val Accuracy 0.7933 Epoch 00557 | Time(s) 0.0339 | Train Loss 0.0324 | Val Accuracy 0.7867 Epoch 00558 | Time(s) 0.0339 | Train Loss 0.0930 | Val Accuracy 0.7800 Epoch 00559 | Time(s) 0.0339 | Train Loss 0.0144 | Val Accuracy 0.7767 Epoch 00560 | Time(s) 0.0339 | Train Loss 0.0638 | Val Accuracy 0.7733 Epoch 00561 | Time(s) 0.0339 | Train Loss 0.0327 | Val Accuracy 0.7733 Epoch 00562 | Time(s) 0.0339 | Train Loss 0.0353 | Val Accuracy 0.7733 Epoch 00563 | Time(s) 0.0339 | Train Loss 0.0139 | Val Accuracy 0.7700 Epoch 00564 | Time(s) 0.0339 | Train Loss 0.0261 | Val Accuracy 0.7667 Epoch 00565 | Time(s) 0.0339 | Train Loss 0.0430 | Val Accuracy 0.7667 Epoch 00566 | Time(s) 0.0339 | Train Loss 0.0485 | Val Accuracy 0.7633 Epoch 00567 | Time(s) 0.0339 | Train Loss 0.0812 | Val Accuracy 0.7600 Epoch 00568 | Time(s) 0.0339 | Train Loss 0.0268 | Val Accuracy 0.7633 Epoch 00569 | Time(s) 0.0339 | Train Loss 0.0837 | Val Accuracy 0.7633 Epoch 00570 | Time(s) 0.0339 | Train Loss 0.1128 | Val Accuracy 0.7600 Epoch 00571 | Time(s) 0.0339 | Train Loss 0.0200 | Val Accuracy 0.7600 Epoch 00572 | Time(s) 0.0339 | Train Loss 0.0740 | Val Accuracy 0.7567 Epoch 00573 | Time(s) 0.0339 | Train Loss 0.0085 | Val Accuracy 0.7567 Epoch 00574 | Time(s) 0.0339 | Train Loss 0.0126 | Val Accuracy 0.7567 Epoch 00575 | Time(s) 0.0338 | Train Loss 0.0587 | Val Accuracy 0.7633 Epoch 00576 | Time(s) 0.0338 | Train Loss 0.0635 | Val Accuracy 0.7700 Epoch 00577 | Time(s) 0.0338 | Train Loss 0.0260 | Val Accuracy 0.7700 Epoch 00578 | Time(s) 0.0338 | Train Loss 0.0917 | Val Accuracy 0.7700 Epoch 00579 | Time(s) 0.0338 | Train Loss 0.0433 | Val Accuracy 0.7667 Epoch 00580 | Time(s) 0.0338 | Train Loss 0.0572 | Val Accuracy 0.7667 Epoch 00581 | Time(s) 0.0338 | Train Loss 0.0162 | Val Accuracy 0.7667 Epoch 00582 | Time(s) 0.0338 | Train Loss 0.0307 | Val Accuracy 0.7800 Epoch 00583 | Time(s) 0.0338 | Train Loss 0.0268 | Val Accuracy 0.7800 Epoch 00584 | Time(s) 0.0338 | Train Loss 0.0415 | Val Accuracy 0.7800 Epoch 00585 | Time(s) 0.0338 | Train Loss 0.0260 | Val Accuracy 0.7800 Epoch 00586 | Time(s) 0.0338 | Train Loss 0.0390 | Val Accuracy 0.7800 Epoch 00587 | Time(s) 0.0337 | Train Loss 0.0544 | Val Accuracy 0.7800 Epoch 00588 | Time(s) 0.0338 | Train Loss 0.0237 | Val Accuracy 0.7767 Epoch 00589 | Time(s) 0.0338 | Train Loss 0.0349 | Val Accuracy 0.7767 Epoch 00590 | Time(s) 0.0338 | Train Loss 0.0094 | Val Accuracy 0.7733 Epoch 00591 | Time(s) 0.0338 | Train Loss 0.0127 | Val Accuracy 0.7733 Epoch 00592 | Time(s) 0.0338 | Train Loss 0.0377 | Val Accuracy 0.7700 Epoch 00593 | Time(s) 0.0338 | Train Loss 0.0613 | Val Accuracy 0.7767 Epoch 00594 | Time(s) 0.0338 | Train Loss 0.0508 | Val Accuracy 0.7833 Epoch 00595 | Time(s) 0.0338 | Train Loss 0.0421 | Val Accuracy 0.7800 Epoch 00596 | Time(s) 0.0338 | Train Loss 0.0409 | Val Accuracy 0.7767 Epoch 00597 | Time(s) 0.0337 | Train Loss 0.0126 | Val Accuracy 0.7800 Epoch 00598 | Time(s) 0.0337 | Train Loss 0.0489 | Val Accuracy 0.7833 Epoch 00599 | Time(s) 0.0337 | Train Loss 0.0238 | Val Accuracy 0.7867 Epoch 00600 | 
Time(s) 0.0337 | Train Loss 0.0429 | Val Accuracy 0.7867 Epoch 00601 | Time(s) 0.0337 | Train Loss 0.0262 | Val Accuracy 0.7867 Epoch 00602 | Time(s) 0.0337 | Train Loss 0.0146 | Val Accuracy 0.7900 Epoch 00603 | Time(s) 0.0337 | Train Loss 0.0431 | Val Accuracy 0.7900 Epoch 00604 | Time(s) 0.0337 | Train Loss 0.0254 | Val Accuracy 0.7967 Epoch 00605 | Time(s) 0.0337 | Train Loss 0.0893 | Val Accuracy 0.7933 Epoch 00606 | Time(s) 0.0337 | Train Loss 0.0055 | Val Accuracy 0.8000 Epoch 00607 | Time(s) 0.0337 | Train Loss 0.0687 | Val Accuracy 0.8000 Epoch 00608 | Time(s) 0.0337 | Train Loss 0.0683 | Val Accuracy 0.8067 Epoch 00609 | Time(s) 0.0337 | Train Loss 0.0226 | Val Accuracy 0.8000 Epoch 00610 | Time(s) 0.0337 | Train Loss 0.0396 | Val Accuracy 0.8000 Epoch 00611 | Time(s) 0.0337 | Train Loss 0.0451 | Val Accuracy 0.7967 Epoch 00612 | Time(s) 0.0337 | Train Loss 0.0119 | Val Accuracy 0.7933 Epoch 00613 | Time(s) 0.0337 | Train Loss 0.0399 | Val Accuracy 0.7933 Epoch 00614 | Time(s) 0.0337 | Train Loss 0.0349 | Val Accuracy 0.7933 Epoch 00615 | Time(s) 0.0337 | Train Loss 0.0446 | Val Accuracy 0.7867 Epoch 00616 | Time(s) 0.0337 | Train Loss 0.0532 | Val Accuracy 0.7867 Epoch 00617 | Time(s) 0.0337 | Train Loss 0.0339 | Val Accuracy 0.7867 Epoch 00618 | Time(s) 0.0337 | Train Loss 0.0627 | Val Accuracy 0.7867 Epoch 00619 | Time(s) 0.0337 | Train Loss 0.0201 | Val Accuracy 0.7833 Epoch 00620 | Time(s) 0.0337 | Train Loss 0.0456 | Val Accuracy 0.7867 Epoch 00621 | Time(s) 0.0337 | Train Loss 0.0473 | Val Accuracy 0.7867 Epoch 00622 | Time(s) 0.0337 | Train Loss 0.0164 | Val Accuracy 0.7900 Epoch 00623 | Time(s) 0.0337 | Train Loss 0.0317 | Val Accuracy 0.7933 Epoch 00624 | Time(s) 0.0337 | Train Loss 0.0619 | Val Accuracy 0.7933 Epoch 00625 | Time(s) 0.0337 | Train Loss 0.0376 | Val Accuracy 0.7967 Epoch 00626 | Time(s) 0.0337 | Train Loss 0.0669 | Val Accuracy 0.7967 Epoch 00627 | Time(s) 0.0337 | Train Loss 0.0129 | Val Accuracy 0.8000 Epoch 00628 | Time(s) 0.0337 | Train Loss 0.0250 | Val Accuracy 0.8067 Epoch 00629 | Time(s) 0.0337 | Train Loss 0.0644 | Val Accuracy 0.8067 Epoch 00630 | Time(s) 0.0337 | Train Loss 0.0836 | Val Accuracy 0.8133 Epoch 00631 | Time(s) 0.0337 | Train Loss 0.0651 | Val Accuracy 0.8233 Epoch 00632 | Time(s) 0.0337 | Train Loss 0.0379 | Val Accuracy 0.8167 Epoch 00633 | Time(s) 0.0337 | Train Loss 0.0946 | Val Accuracy 0.8167 Epoch 00634 | Time(s) 0.0337 | Train Loss 0.0760 | Val Accuracy 0.8167 Epoch 00635 | Time(s) 0.0337 | Train Loss 0.0358 | Val Accuracy 0.8133 Epoch 00636 | Time(s) 0.0336 | Train Loss 0.0401 | Val Accuracy 0.8033 Epoch 00637 | Time(s) 0.0337 | Train Loss 0.0478 | Val Accuracy 0.7967 Epoch 00638 | Time(s) 0.0336 | Train Loss 0.0265 | Val Accuracy 0.8000 Epoch 00639 | Time(s) 0.0336 | Train Loss 0.0244 | Val Accuracy 0.7967 Epoch 00640 | Time(s) 0.0336 | Train Loss 0.0276 | Val Accuracy 0.7967 Epoch 00641 | Time(s) 0.0336 | Train Loss 0.0194 | Val Accuracy 0.7967 Epoch 00642 | Time(s) 0.0336 | Train Loss 0.0164 | Val Accuracy 0.8000 Epoch 00643 | Time(s) 0.0336 | Train Loss 0.0533 | Val Accuracy 0.8000 Epoch 00644 | Time(s) 0.0336 | Train Loss 0.0274 | Val Accuracy 0.7967 Epoch 00645 | Time(s) 0.0336 | Train Loss 0.0592 | Val Accuracy 0.7967 Epoch 00646 | Time(s) 0.0336 | Train Loss 0.0329 | Val Accuracy 0.8033 Epoch 00647 | Time(s) 0.0336 | Train Loss 0.0271 | Val Accuracy 0.7967 Epoch 00648 | Time(s) 0.0336 | Train Loss 0.0552 | Val Accuracy 0.7967 Epoch 00649 | Time(s) 0.0336 | Train Loss 0.0220 | Val Accuracy 0.8000 Epoch 00650 | 
Time(s) 0.0336 | Train Loss 0.0911 | Val Accuracy 0.7933 Epoch 00651 | Time(s) 0.0336 | Train Loss 0.0456 | Val Accuracy 0.8067 Epoch 00652 | Time(s) 0.0336 | Train Loss 0.1064 | Val Accuracy 0.8133 Epoch 00653 | Time(s) 0.0336 | Train Loss 0.0344 | Val Accuracy 0.8100 Epoch 00654 | Time(s) 0.0336 | Train Loss 0.1392 | Val Accuracy 0.8100 Epoch 00655 | Time(s) 0.0336 | Train Loss 0.0951 | Val Accuracy 0.8133 Epoch 00656 | Time(s) 0.0336 | Train Loss 0.0133 | Val Accuracy 0.8133 Epoch 00657 | Time(s) 0.0336 | Train Loss 0.0794 | Val Accuracy 0.8100 Epoch 00658 | Time(s) 0.0336 | Train Loss 0.0833 | Val Accuracy 0.8033 Epoch 00659 | Time(s) 0.0336 | Train Loss 0.0949 | Val Accuracy 0.7933 Epoch 00660 | Time(s) 0.0336 | Train Loss 0.0423 | Val Accuracy 0.7900 Epoch 00661 | Time(s) 0.0336 | Train Loss 0.0320 | Val Accuracy 0.7933 Epoch 00662 | Time(s) 0.0336 | Train Loss 0.0408 | Val Accuracy 0.7833 Epoch 00663 | Time(s) 0.0336 | Train Loss 0.0329 | Val Accuracy 0.7767 Epoch 00664 | Time(s) 0.0336 | Train Loss 0.0801 | Val Accuracy 0.7767 Epoch 00665 | Time(s) 0.0336 | Train Loss 0.1009 | Val Accuracy 0.7733 Epoch 00666 | Time(s) 0.0335 | Train Loss 0.0382 | Val Accuracy 0.7733 Epoch 00667 | Time(s) 0.0335 | Train Loss 0.0486 | Val Accuracy 0.7767 Epoch 00668 | Time(s) 0.0336 | Train Loss 0.0221 | Val Accuracy 0.7767 Epoch 00669 | Time(s) 0.0335 | Train Loss 0.0348 | Val Accuracy 0.7867 Epoch 00670 | Time(s) 0.0335 | Train Loss 0.1380 | Val Accuracy 0.7867 Epoch 00671 | Time(s) 0.0335 | Train Loss 0.0734 | Val Accuracy 0.7767 Epoch 00672 | Time(s) 0.0336 | Train Loss 0.0325 | Val Accuracy 0.7900 Epoch 00673 | Time(s) 0.0336 | Train Loss 0.0328 | Val Accuracy 0.7900 Epoch 00674 | Time(s) 0.0336 | Train Loss 0.0391 | Val Accuracy 0.7900 Epoch 00675 | Time(s) 0.0335 | Train Loss 0.0612 | Val Accuracy 0.7833 Epoch 00676 | Time(s) 0.0335 | Train Loss 0.0572 | Val Accuracy 0.7867 Epoch 00677 | Time(s) 0.0335 | Train Loss 0.0750 | Val Accuracy 0.7833 Epoch 00678 | Time(s) 0.0335 | Train Loss 0.0489 | Val Accuracy 0.7933 Epoch 00679 | Time(s) 0.0335 | Train Loss 0.0563 | Val Accuracy 0.7967 Epoch 00680 | Time(s) 0.0335 | Train Loss 0.0832 | Val Accuracy 0.7900 Epoch 00681 | Time(s) 0.0335 | Train Loss 0.0813 | Val Accuracy 0.7933 Epoch 00682 | Time(s) 0.0335 | Train Loss 0.0542 | Val Accuracy 0.7867 Epoch 00683 | Time(s) 0.0335 | Train Loss 0.0947 | Val Accuracy 0.7900 Epoch 00684 | Time(s) 0.0335 | Train Loss 0.0530 | Val Accuracy 0.7933 Epoch 00685 | Time(s) 0.0335 | Train Loss 0.0466 | Val Accuracy 0.7933 Epoch 00686 | Time(s) 0.0335 | Train Loss 0.0188 | Val Accuracy 0.7867 Epoch 00687 | Time(s) 0.0335 | Train Loss 0.0674 | Val Accuracy 0.7733 Epoch 00688 | Time(s) 0.0335 | Train Loss 0.0556 | Val Accuracy 0.7900 Epoch 00689 | Time(s) 0.0335 | Train Loss 0.0606 | Val Accuracy 0.7900 Epoch 00690 | Time(s) 0.0335 | Train Loss 0.0841 | Val Accuracy 0.7933 Epoch 00691 | Time(s) 0.0335 | Train Loss 0.0400 | Val Accuracy 0.8000 Epoch 00692 | Time(s) 0.0335 | Train Loss 0.0517 | Val Accuracy 0.7933 Epoch 00693 | Time(s) 0.0335 | Train Loss 0.0322 | Val Accuracy 0.7933 Epoch 00694 | Time(s) 0.0335 | Train Loss 0.0627 | Val Accuracy 0.7933 Epoch 00695 | Time(s) 0.0335 | Train Loss 0.0664 | Val Accuracy 0.7833 Epoch 00696 | Time(s) 0.0334 | Train Loss 0.0164 | Val Accuracy 0.7867 Epoch 00697 | Time(s) 0.0334 | Train Loss 0.0463 | Val Accuracy 0.7833 Epoch 00698 | Time(s) 0.0334 | Train Loss 0.0149 | Val Accuracy 0.7867 Epoch 00699 | Time(s) 0.0334 | Train Loss 0.0244 | Val Accuracy 0.7833 Epoch 00700 | 
Time(s) 0.0334 | Train Loss 0.0467 | Val Accuracy 0.7800 Epoch 00701 | Time(s) 0.0334 | Train Loss 0.0244 | Val Accuracy 0.7800 Epoch 00702 | Time(s) 0.0334 | Train Loss 0.1103 | Val Accuracy 0.7800 Epoch 00703 | Time(s) 0.0334 | Train Loss 0.0605 | Val Accuracy 0.7833 Epoch 00704 | Time(s) 0.0334 | Train Loss 0.1249 | Val Accuracy 0.7967 Epoch 00705 | Time(s) 0.0334 | Train Loss 0.0273 | Val Accuracy 0.8100 Epoch 00706 | Time(s) 0.0334 | Train Loss 0.0342 | Val Accuracy 0.8067 Epoch 00707 | Time(s) 0.0334 | Train Loss 0.0292 | Val Accuracy 0.8033 Epoch 00708 | Time(s) 0.0334 | Train Loss 0.0087 | Val Accuracy 0.8133 Epoch 00709 | Time(s) 0.0334 | Train Loss 0.0764 | Val Accuracy 0.8133 Epoch 00710 | Time(s) 0.0333 | Train Loss 0.0192 | Val Accuracy 0.8100 Epoch 00711 | Time(s) 0.0333 | Train Loss 0.1317 | Val Accuracy 0.8067 Epoch 00712 | Time(s) 0.0333 | Train Loss 0.0171 | Val Accuracy 0.8033 Epoch 00713 | Time(s) 0.0333 | Train Loss 0.0455 | Val Accuracy 0.8033 Epoch 00714 | Time(s) 0.0333 | Train Loss 0.0884 | Val Accuracy 0.8067 Epoch 00715 | Time(s) 0.0333 | Train Loss 0.0520 | Val Accuracy 0.8033 Epoch 00716 | Time(s) 0.0333 | Train Loss 0.0291 | Val Accuracy 0.8033 Epoch 00717 | Time(s) 0.0333 | Train Loss 0.0340 | Val Accuracy 0.8000 Epoch 00718 | Time(s) 0.0333 | Train Loss 0.0555 | Val Accuracy 0.7967 Epoch 00719 | Time(s) 0.0333 | Train Loss 0.0406 | Val Accuracy 0.7967 Epoch 00720 | Time(s) 0.0333 | Train Loss 0.0819 | Val Accuracy 0.8000 Epoch 00721 | Time(s) 0.0333 | Train Loss 0.0233 | Val Accuracy 0.8000 Epoch 00722 | Time(s) 0.0333 | Train Loss 0.0852 | Val Accuracy 0.7967 Epoch 00723 | Time(s) 0.0333 | Train Loss 0.0244 | Val Accuracy 0.7967 Epoch 00724 | Time(s) 0.0333 | Train Loss 0.0835 | Val Accuracy 0.7967 Epoch 00725 | Time(s) 0.0333 | Train Loss 0.0754 | Val Accuracy 0.7967 Epoch 00726 | Time(s) 0.0333 | Train Loss 0.0975 | Val Accuracy 0.8033 Epoch 00727 | Time(s) 0.0333 | Train Loss 0.0508 | Val Accuracy 0.8100 Epoch 00728 | Time(s) 0.0332 | Train Loss 0.0444 | Val Accuracy 0.8033 Epoch 00729 | Time(s) 0.0332 | Train Loss 0.0350 | Val Accuracy 0.8000 Epoch 00730 | Time(s) 0.0332 | Train Loss 0.0176 | Val Accuracy 0.8067 Epoch 00731 | Time(s) 0.0332 | Train Loss 0.0236 | Val Accuracy 0.8033 Epoch 00732 | Time(s) 0.0332 | Train Loss 0.0855 | Val Accuracy 0.8033 Epoch 00733 | Time(s) 0.0332 | Train Loss 0.0854 | Val Accuracy 0.8067 Epoch 00734 | Time(s) 0.0332 | Train Loss 0.0086 | Val Accuracy 0.8033 Epoch 00735 | Time(s) 0.0332 | Train Loss 0.0193 | Val Accuracy 0.8133 Epoch 00736 | Time(s) 0.0332 | Train Loss 0.0129 | Val Accuracy 0.8100 Epoch 00737 | Time(s) 0.0332 | Train Loss 0.0406 | Val Accuracy 0.8133 Epoch 00738 | Time(s) 0.0332 | Train Loss 0.0507 | Val Accuracy 0.8100 Epoch 00739 | Time(s) 0.0332 | Train Loss 0.0123 | Val Accuracy 0.8133 Epoch 00740 | Time(s) 0.0332 | Train Loss 0.0435 | Val Accuracy 0.8167 Epoch 00741 | Time(s) 0.0332 | Train Loss 0.0401 | Val Accuracy 0.8100 Epoch 00742 | Time(s) 0.0332 | Train Loss 0.0701 | Val Accuracy 0.8133 Epoch 00743 | Time(s) 0.0332 | Train Loss 0.0470 | Val Accuracy 0.8033 Epoch 00744 | Time(s) 0.0332 | Train Loss 0.0331 | Val Accuracy 0.8000 Epoch 00745 | Time(s) 0.0332 | Train Loss 0.0319 | Val Accuracy 0.7967 Epoch 00746 | Time(s) 0.0332 | Train Loss 0.0333 | Val Accuracy 0.7900 Epoch 00747 | Time(s) 0.0332 | Train Loss 0.0602 | Val Accuracy 0.7967 Epoch 00748 | Time(s) 0.0331 | Train Loss 0.0548 | Val Accuracy 0.8000 Epoch 00749 | Time(s) 0.0331 | Train Loss 0.0369 | Val Accuracy 0.7933 Epoch 00750 | 
Time(s) 0.0331 | Train Loss 0.0253 | Val Accuracy 0.7900 Epoch 00751 | Time(s) 0.0331 | Train Loss 0.0437 | Val Accuracy 0.7867 Epoch 00752 | Time(s) 0.0331 | Train Loss 0.0300 | Val Accuracy 0.7900 Epoch 00753 | Time(s) 0.0331 | Train Loss 0.0542 | Val Accuracy 0.7933 Epoch 00754 | Time(s) 0.0331 | Train Loss 0.0183 | Val Accuracy 0.7933 Epoch 00755 | Time(s) 0.0331 | Train Loss 0.1230 | Val Accuracy 0.7967 Epoch 00756 | Time(s) 0.0331 | Train Loss 0.0332 | Val Accuracy 0.7967 Epoch 00757 | Time(s) 0.0331 | Train Loss 0.0601 | Val Accuracy 0.7933 Epoch 00758 | Time(s) 0.0331 | Train Loss 0.0490 | Val Accuracy 0.7933 Epoch 00759 | Time(s) 0.0331 | Train Loss 0.0570 | Val Accuracy 0.7967 Epoch 00760 | Time(s) 0.0331 | Train Loss 0.0461 | Val Accuracy 0.8033 Epoch 00761 | Time(s) 0.0331 | Train Loss 0.0643 | Val Accuracy 0.8067 Epoch 00762 | Time(s) 0.0331 | Train Loss 0.0379 | Val Accuracy 0.8067 Epoch 00763 | Time(s) 0.0331 | Train Loss 0.0195 | Val Accuracy 0.8067 Epoch 00764 | Time(s) 0.0331 | Train Loss 0.0393 | Val Accuracy 0.8033 Epoch 00765 | Time(s) 0.0331 | Train Loss 0.0462 | Val Accuracy 0.8067 Epoch 00766 | Time(s) 0.0331 | Train Loss 0.0692 | Val Accuracy 0.8067 Epoch 00767 | Time(s) 0.0331 | Train Loss 0.0739 | Val Accuracy 0.8067 Epoch 00768 | Time(s) 0.0332 | Train Loss 0.0718 | Val Accuracy 0.8100 Epoch 00769 | Time(s) 0.0332 | Train Loss 0.0551 | Val Accuracy 0.8100 Epoch 00770 | Time(s) 0.0332 | Train Loss 0.0597 | Val Accuracy 0.8067 Epoch 00771 | Time(s) 0.0332 | Train Loss 0.1067 | Val Accuracy 0.8033 Epoch 00772 | Time(s) 0.0333 | Train Loss 0.0371 | Val Accuracy 0.8067 Epoch 00773 | Time(s) 0.0333 | Train Loss 0.0329 | Val Accuracy 0.8067 Epoch 00774 | Time(s) 0.0333 | Train Loss 0.0524 | Val Accuracy 0.8067 Epoch 00775 | Time(s) 0.0333 | Train Loss 0.1609 | Val Accuracy 0.8133 Epoch 00776 | Time(s) 0.0333 | Train Loss 0.0572 | Val Accuracy 0.8033 Epoch 00777 | Time(s) 0.0333 | Train Loss 0.0525 | Val Accuracy 0.8000 Epoch 00778 | Time(s) 0.0333 | Train Loss 0.0134 | Val Accuracy 0.7967 Epoch 00779 | Time(s) 0.0333 | Train Loss 0.0925 | Val Accuracy 0.8000 Epoch 00780 | Time(s) 0.0333 | Train Loss 0.0207 | Val Accuracy 0.7967 Epoch 00781 | Time(s) 0.0333 | Train Loss 0.0559 | Val Accuracy 0.7933 Epoch 00782 | Time(s) 0.0333 | Train Loss 0.0198 | Val Accuracy 0.7933 Epoch 00783 | Time(s) 0.0333 | Train Loss 0.0595 | Val Accuracy 0.7933 Epoch 00784 | Time(s) 0.0333 | Train Loss 0.0548 | Val Accuracy 0.7933 Epoch 00785 | Time(s) 0.0333 | Train Loss 0.1028 | Val Accuracy 0.7933 Epoch 00786 | Time(s) 0.0333 | Train Loss 0.0180 | Val Accuracy 0.7900 Epoch 00787 | Time(s) 0.0333 | Train Loss 0.0411 | Val Accuracy 0.7933 Epoch 00788 | Time(s) 0.0333 | Train Loss 0.0793 | Val Accuracy 0.7867 Epoch 00789 | Time(s) 0.0333 | Train Loss 0.0313 | Val Accuracy 0.7900 Epoch 00790 | Time(s) 0.0333 | Train Loss 0.0648 | Val Accuracy 0.7967 Epoch 00791 | Time(s) 0.0333 | Train Loss 0.0160 | Val Accuracy 0.7967 Epoch 00792 | Time(s) 0.0332 | Train Loss 0.0187 | Val Accuracy 0.7967 Epoch 00793 | Time(s) 0.0332 | Train Loss 0.0166 | Val Accuracy 0.7933 Epoch 00794 | Time(s) 0.0333 | Train Loss 0.0272 | Val Accuracy 0.7967 Epoch 00795 | Time(s) 0.0333 | Train Loss 0.0811 | Val Accuracy 0.7967 Epoch 00796 | Time(s) 0.0333 | Train Loss 0.0280 | Val Accuracy 0.8033 Epoch 00797 | Time(s) 0.0333 | Train Loss 0.0981 | Val Accuracy 0.8033 Epoch 00798 | Time(s) 0.0333 | Train Loss 0.1055 | Val Accuracy 0.8067 Epoch 00799 | Time(s) 0.0333 | Train Loss 0.0127 | Val Accuracy 0.8100 Epoch 00800 | 
Time(s) 0.0333 | Train Loss 0.0202 | Val Accuracy 0.8067 Epoch 00801 | Time(s) 0.0333 | Train Loss 0.0530 | Val Accuracy 0.8100 Epoch 00802 | Time(s) 0.0333 | Train Loss 0.0742 | Val Accuracy 0.8067 Epoch 00803 | Time(s) 0.0333 | Train Loss 0.0355 | Val Accuracy 0.8067 Epoch 00804 | Time(s) 0.0333 | Train Loss 0.0350 | Val Accuracy 0.8033 Epoch 00805 | Time(s) 0.0333 | Train Loss 0.1308 | Val Accuracy 0.8067 Epoch 00806 | Time(s) 0.0333 | Train Loss 0.0631 | Val Accuracy 0.8033 Epoch 00807 | Time(s) 0.0333 | Train Loss 0.0630 | Val Accuracy 0.8100 Epoch 00808 | Time(s) 0.0333 | Train Loss 0.0155 | Val Accuracy 0.8100 Epoch 00809 | Time(s) 0.0333 | Train Loss 0.0225 | Val Accuracy 0.8000 Epoch 00810 | Time(s) 0.0333 | Train Loss 0.0311 | Val Accuracy 0.8033 Epoch 00811 | Time(s) 0.0333 | Train Loss 0.0445 | Val Accuracy 0.8067 Epoch 00812 | Time(s) 0.0332 | Train Loss 0.0738 | Val Accuracy 0.8033 Epoch 00813 | Time(s) 0.0332 | Train Loss 0.0403 | Val Accuracy 0.8033 Epoch 00814 | Time(s) 0.0332 | Train Loss 0.1174 | Val Accuracy 0.7967 Epoch 00815 | Time(s) 0.0332 | Train Loss 0.0433 | Val Accuracy 0.7900 Epoch 00816 | Time(s) 0.0332 | Train Loss 0.0512 | Val Accuracy 0.7967 Epoch 00817 | Time(s) 0.0332 | Train Loss 0.1127 | Val Accuracy 0.7900 Epoch 00818 | Time(s) 0.0332 | Train Loss 0.0273 | Val Accuracy 0.7733 Epoch 00819 | Time(s) 0.0332 | Train Loss 0.0409 | Val Accuracy 0.7633 Epoch 00820 | Time(s) 0.0332 | Train Loss 0.0535 | Val Accuracy 0.7633 Epoch 00821 | Time(s) 0.0332 | Train Loss 0.0431 | Val Accuracy 0.7633 Epoch 00822 | Time(s) 0.0332 | Train Loss 0.0629 | Val Accuracy 0.7667 Epoch 00823 | Time(s) 0.0332 | Train Loss 0.0616 | Val Accuracy 0.7633 Epoch 00824 | Time(s) 0.0332 | Train Loss 0.1338 | Val Accuracy 0.7733 Epoch 00825 | Time(s) 0.0332 | Train Loss 0.0841 | Val Accuracy 0.7733 Epoch 00826 | Time(s) 0.0332 | Train Loss 0.0672 | Val Accuracy 0.7800 Epoch 00827 | Time(s) 0.0332 | Train Loss 0.0555 | Val Accuracy 0.7867 Epoch 00828 | Time(s) 0.0332 | Train Loss 0.0431 | Val Accuracy 0.7833 Epoch 00829 | Time(s) 0.0333 | Train Loss 0.0861 | Val Accuracy 0.7967 Epoch 00830 | Time(s) 0.0333 | Train Loss 0.0368 | Val Accuracy 0.8000 Epoch 00831 | Time(s) 0.0333 | Train Loss 0.0564 | Val Accuracy 0.8100 Epoch 00832 | Time(s) 0.0333 | Train Loss 0.0903 | Val Accuracy 0.8167 Epoch 00833 | Time(s) 0.0333 | Train Loss 0.0632 | Val Accuracy 0.8067 Epoch 00834 | Time(s) 0.0332 | Train Loss 0.0521 | Val Accuracy 0.8067 Epoch 00835 | Time(s) 0.0332 | Train Loss 0.0271 | Val Accuracy 0.8033 Epoch 00836 | Time(s) 0.0332 | Train Loss 0.0489 | Val Accuracy 0.8000 Epoch 00837 | Time(s) 0.0332 | Train Loss 0.0656 | Val Accuracy 0.7967 Epoch 00838 | Time(s) 0.0332 | Train Loss 0.0527 | Val Accuracy 0.7933 Epoch 00839 | Time(s) 0.0332 | Train Loss 0.0949 | Val Accuracy 0.7967 Epoch 00840 | Time(s) 0.0332 | Train Loss 0.0400 | Val Accuracy 0.8033 Epoch 00841 | Time(s) 0.0332 | Train Loss 0.0775 | Val Accuracy 0.8100 Epoch 00842 | Time(s) 0.0332 | Train Loss 0.0799 | Val Accuracy 0.8133 Epoch 00843 | Time(s) 0.0332 | Train Loss 0.0891 | Val Accuracy 0.8267 Epoch 00844 | Time(s) 0.0332 | Train Loss 0.0262 | Val Accuracy 0.8300 Epoch 00845 | Time(s) 0.0332 | Train Loss 0.0425 | Val Accuracy 0.8233 Epoch 00846 | Time(s) 0.0332 | Train Loss 0.0561 | Val Accuracy 0.8267 Epoch 00847 | Time(s) 0.0332 | Train Loss 0.0552 | Val Accuracy 0.8200 Epoch 00848 | Time(s) 0.0332 | Train Loss 0.0704 | Val Accuracy 0.8100 Epoch 00849 | Time(s) 0.0332 | Train Loss 0.0677 | Val Accuracy 0.8000 Epoch 00850 | 
Time(s) 0.0332 | Train Loss 0.0578 | Val Accuracy 0.7767 Epoch 00851 | Time(s) 0.0332 | Train Loss 0.0109 | Val Accuracy 0.7733 Epoch 00852 | Time(s) 0.0331 | Train Loss 0.0568 | Val Accuracy 0.7733 Epoch 00853 | Time(s) 0.0331 | Train Loss 0.1259 | Val Accuracy 0.7700 Epoch 00854 | Time(s) 0.0331 | Train Loss 0.0673 | Val Accuracy 0.7900 Epoch 00855 | Time(s) 0.0331 | Train Loss 0.0978 | Val Accuracy 0.7900 Epoch 00856 | Time(s) 0.0331 | Train Loss 0.0208 | Val Accuracy 0.7900 Epoch 00857 | Time(s) 0.0331 | Train Loss 0.1071 | Val Accuracy 0.7933 Epoch 00858 | Time(s) 0.0331 | Train Loss 0.0286 | Val Accuracy 0.7967 Epoch 00859 | Time(s) 0.0331 | Train Loss 0.0760 | Val Accuracy 0.8033 Epoch 00860 | Time(s) 0.0331 | Train Loss 0.0523 | Val Accuracy 0.8000 Epoch 00861 | Time(s) 0.0331 | Train Loss 0.1413 | Val Accuracy 0.8000 Epoch 00862 | Time(s) 0.0331 | Train Loss 0.0529 | Val Accuracy 0.8033 Epoch 00863 | Time(s) 0.0331 | Train Loss 0.1436 | Val Accuracy 0.8133 Epoch 00864 | Time(s) 0.0331 | Train Loss 0.0691 | Val Accuracy 0.8067 Epoch 00865 | Time(s) 0.0331 | Train Loss 0.0827 | Val Accuracy 0.8033 Epoch 00866 | Time(s) 0.0331 | Train Loss 0.0396 | Val Accuracy 0.8033 Epoch 00867 | Time(s) 0.0331 | Train Loss 0.0139 | Val Accuracy 0.8133 Epoch 00868 | Time(s) 0.0331 | Train Loss 0.0209 | Val Accuracy 0.8133 Epoch 00869 | Time(s) 0.0331 | Train Loss 0.0854 | Val Accuracy 0.8033 Epoch 00870 | Time(s) 0.0331 | Train Loss 0.0320 | Val Accuracy 0.8000 Epoch 00871 | Time(s) 0.0331 | Train Loss 0.0746 | Val Accuracy 0.7967 Epoch 00872 | Time(s) 0.0331 | Train Loss 0.0783 | Val Accuracy 0.7833 Epoch 00873 | Time(s) 0.0330 | Train Loss 0.1305 | Val Accuracy 0.7900 Epoch 00874 | Time(s) 0.0330 | Train Loss 0.1298 | Val Accuracy 0.7867 Epoch 00875 | Time(s) 0.0330 | Train Loss 0.0526 | Val Accuracy 0.7933 Epoch 00876 | Time(s) 0.0330 | Train Loss 0.0504 | Val Accuracy 0.8000 Epoch 00877 | Time(s) 0.0330 | Train Loss 0.0215 | Val Accuracy 0.8000 Epoch 00878 | Time(s) 0.0330 | Train Loss 0.0531 | Val Accuracy 0.8067 Epoch 00879 | Time(s) 0.0330 | Train Loss 0.0772 | Val Accuracy 0.8100 Epoch 00880 | Time(s) 0.0330 | Train Loss 0.0692 | Val Accuracy 0.8100 Epoch 00881 | Time(s) 0.0330 | Train Loss 0.1271 | Val Accuracy 0.8133 Epoch 00882 | Time(s) 0.0330 | Train Loss 0.0912 | Val Accuracy 0.8067 Epoch 00883 | Time(s) 0.0330 | Train Loss 0.0572 | Val Accuracy 0.8167 Epoch 00884 | Time(s) 0.0330 | Train Loss 0.1417 | Val Accuracy 0.8233 Epoch 00885 | Time(s) 0.0330 | Train Loss 0.0564 | Val Accuracy 0.8200 Epoch 00886 | Time(s) 0.0330 | Train Loss 0.0750 | Val Accuracy 0.8167 Epoch 00887 | Time(s) 0.0330 | Train Loss 0.0535 | Val Accuracy 0.8133 Epoch 00888 | Time(s) 0.0329 | Train Loss 0.0390 | Val Accuracy 0.8100 Epoch 00889 | Time(s) 0.0329 | Train Loss 0.0275 | Val Accuracy 0.8033 Epoch 00890 | Time(s) 0.0329 | Train Loss 0.0405 | Val Accuracy 0.8000 Epoch 00891 | Time(s) 0.0329 | Train Loss 0.0410 | Val Accuracy 0.7933 Epoch 00892 | Time(s) 0.0329 | Train Loss 0.0248 | Val Accuracy 0.7933 Epoch 00893 | Time(s) 0.0329 | Train Loss 0.0314 | Val Accuracy 0.7867 Epoch 00894 | Time(s) 0.0329 | Train Loss 0.0936 | Val Accuracy 0.7900 Epoch 00895 | Time(s) 0.0329 | Train Loss 0.0586 | Val Accuracy 0.7933 Epoch 00896 | Time(s) 0.0329 | Train Loss 0.0713 | Val Accuracy 0.7933 Epoch 00897 | Time(s) 0.0329 | Train Loss 0.0478 | Val Accuracy 0.8033 Epoch 00898 | Time(s) 0.0329 | Train Loss 0.0698 | Val Accuracy 0.8000 Epoch 00899 | Time(s) 0.0329 | Train Loss 0.0337 | Val Accuracy 0.8033 Epoch 00900 | 
Time(s) 0.0329 | Train Loss 0.0556 | Val Accuracy 0.8000 Epoch 00901 | Time(s) 0.0329 | Train Loss 0.1321 | Val Accuracy 0.7967 Epoch 00902 | Time(s) 0.0329 | Train Loss 0.0796 | Val Accuracy 0.7900 Epoch 00903 | Time(s) 0.0329 | Train Loss 0.0231 | Val Accuracy 0.7833 Epoch 00904 | Time(s) 0.0328 | Train Loss 0.0341 | Val Accuracy 0.7867 Epoch 00905 | Time(s) 0.0328 | Train Loss 0.0513 | Val Accuracy 0.7900 Epoch 00906 | Time(s) 0.0328 | Train Loss 0.0620 | Val Accuracy 0.7933 Epoch 00907 | Time(s) 0.0328 | Train Loss 0.0415 | Val Accuracy 0.8000 Epoch 00908 | Time(s) 0.0328 | Train Loss 0.0573 | Val Accuracy 0.7933 Epoch 00909 | Time(s) 0.0328 | Train Loss 0.0913 | Val Accuracy 0.8067 Epoch 00910 | Time(s) 0.0328 | Train Loss 0.0276 | Val Accuracy 0.8100 Epoch 00911 | Time(s) 0.0328 | Train Loss 0.1154 | Val Accuracy 0.8100 Epoch 00912 | Time(s) 0.0328 | Train Loss 0.0152 | Val Accuracy 0.8067 Epoch 00913 | Time(s) 0.0328 | Train Loss 0.0530 | Val Accuracy 0.8067 Epoch 00914 | Time(s) 0.0328 | Train Loss 0.0695 | Val Accuracy 0.8133 Epoch 00915 | Time(s) 0.0328 | Train Loss 0.0862 | Val Accuracy 0.8133 Epoch 00916 | Time(s) 0.0328 | Train Loss 0.0764 | Val Accuracy 0.8167 Epoch 00917 | Time(s) 0.0328 | Train Loss 0.0345 | Val Accuracy 0.8133 Epoch 00918 | Time(s) 0.0328 | Train Loss 0.0331 | Val Accuracy 0.8100 Epoch 00919 | Time(s) 0.0327 | Train Loss 0.0391 | Val Accuracy 0.8067 Epoch 00920 | Time(s) 0.0328 | Train Loss 0.0500 | Val Accuracy 0.8067 Epoch 00921 | Time(s) 0.0328 | Train Loss 0.0659 | Val Accuracy 0.8033 Epoch 00922 | Time(s) 0.0327 | Train Loss 0.0951 | Val Accuracy 0.7933 Epoch 00923 | Time(s) 0.0327 | Train Loss 0.0803 | Val Accuracy 0.7900 Epoch 00924 | Time(s) 0.0327 | Train Loss 0.0596 | Val Accuracy 0.7900 Epoch 00925 | Time(s) 0.0327 | Train Loss 0.0121 | Val Accuracy 0.7933 Epoch 00926 | Time(s) 0.0327 | Train Loss 0.0476 | Val Accuracy 0.7867 Epoch 00927 | Time(s) 0.0327 | Train Loss 0.0237 | Val Accuracy 0.7767 Epoch 00928 | Time(s) 0.0327 | Train Loss 0.0382 | Val Accuracy 0.7767 Epoch 00929 | Time(s) 0.0327 | Train Loss 0.0197 | Val Accuracy 0.7833 Epoch 00930 | Time(s) 0.0327 | Train Loss 0.0768 | Val Accuracy 0.7867 Epoch 00931 | Time(s) 0.0327 | Train Loss 0.0477 | Val Accuracy 0.7867 Epoch 00932 | Time(s) 0.0327 | Train Loss 0.1269 | Val Accuracy 0.7833 Epoch 00933 | Time(s) 0.0327 | Train Loss 0.0890 | Val Accuracy 0.7867 Epoch 00934 | Time(s) 0.0327 | Train Loss 0.0941 | Val Accuracy 0.7900 Epoch 00935 | Time(s) 0.0327 | Train Loss 0.0722 | Val Accuracy 0.7933 Epoch 00936 | Time(s) 0.0327 | Train Loss 0.0551 | Val Accuracy 0.7933 Epoch 00937 | Time(s) 0.0327 | Train Loss 0.0684 | Val Accuracy 0.7900 Epoch 00938 | Time(s) 0.0327 | Train Loss 0.0544 | Val Accuracy 0.7900 Epoch 00939 | Time(s) 0.0327 | Train Loss 0.0980 | Val Accuracy 0.7933 Epoch 00940 | Time(s) 0.0327 | Train Loss 0.0110 | Val Accuracy 0.7933 Epoch 00941 | Time(s) 0.0327 | Train Loss 0.0882 | Val Accuracy 0.7933 Epoch 00942 | Time(s) 0.0327 | Train Loss 0.0308 | Val Accuracy 0.7967 Epoch 00943 | Time(s) 0.0326 | Train Loss 0.0484 | Val Accuracy 0.7967 Epoch 00944 | Time(s) 0.0326 | Train Loss 0.1021 | Val Accuracy 0.7900 Epoch 00945 | Time(s) 0.0326 | Train Loss 0.0346 | Val Accuracy 0.7900 Epoch 00946 | Time(s) 0.0326 | Train Loss 0.0295 | Val Accuracy 0.7900 Epoch 00947 | Time(s) 0.0326 | Train Loss 0.0996 | Val Accuracy 0.7900 Epoch 00948 | Time(s) 0.0326 | Train Loss 0.0100 | Val Accuracy 0.7967 Epoch 00949 | Time(s) 0.0326 | Train Loss 0.0747 | Val Accuracy 0.7967 Epoch 00950 | 
Time(s) 0.0326 | Train Loss 0.0606 | Val Accuracy 0.7967 Epoch 00951 | Time(s) 0.0326 | Train Loss 0.0346 | Val Accuracy 0.7933 Epoch 00952 | Time(s) 0.0326 | Train Loss 0.0506 | Val Accuracy 0.7933 Epoch 00953 | Time(s) 0.0326 | Train Loss 0.1063 | Val Accuracy 0.7967 Epoch 00954 | Time(s) 0.0326 | Train Loss 0.0155 | Val Accuracy 0.7900 Epoch 00955 | Time(s) 0.0326 | Train Loss 0.0573 | Val Accuracy 0.7933 Epoch 00956 | Time(s) 0.0326 | Train Loss 0.0507 | Val Accuracy 0.7867 Epoch 00957 | Time(s) 0.0326 | Train Loss 0.0145 | Val Accuracy 0.7900 Epoch 00958 | Time(s) 0.0326 | Train Loss 0.0160 | Val Accuracy 0.7900 Epoch 00959 | Time(s) 0.0326 | Train Loss 0.1181 | Val Accuracy 0.7900 Epoch 00960 | Time(s) 0.0326 | Train Loss 0.0453 | Val Accuracy 0.7867 Epoch 00961 | Time(s) 0.0326 | Train Loss 0.0274 | Val Accuracy 0.7833 Epoch 00962 | Time(s) 0.0326 | Train Loss 0.0560 | Val Accuracy 0.7833 Epoch 00963 | Time(s) 0.0326 | Train Loss 0.0439 | Val Accuracy 0.7867 Epoch 00964 | Time(s) 0.0326 | Train Loss 0.0251 | Val Accuracy 0.7933 Epoch 00965 | Time(s) 0.0326 | Train Loss 0.0314 | Val Accuracy 0.7900 Epoch 00966 | Time(s) 0.0326 | Train Loss 0.0319 | Val Accuracy 0.7867 Epoch 00967 | Time(s) 0.0326 | Train Loss 0.0636 | Val Accuracy 0.7933 Epoch 00968 | Time(s) 0.0326 | Train Loss 0.0379 | Val Accuracy 0.8033 Epoch 00969 | Time(s) 0.0326 | Train Loss 0.0210 | Val Accuracy 0.8033 Epoch 00970 | Time(s) 0.0326 | Train Loss 0.0190 | Val Accuracy 0.7933 Epoch 00971 | Time(s) 0.0326 | Train Loss 0.0258 | Val Accuracy 0.7900 Epoch 00972 | Time(s) 0.0326 | Train Loss 0.0727 | Val Accuracy 0.7800 Epoch 00973 | Time(s) 0.0326 | Train Loss 0.0525 | Val Accuracy 0.7800 Epoch 00974 | Time(s) 0.0326 | Train Loss 0.0470 | Val Accuracy 0.7800 Epoch 00975 | Time(s) 0.0326 | Train Loss 0.0380 | Val Accuracy 0.7800 Epoch 00976 | Time(s) 0.0326 | Train Loss 0.0358 | Val Accuracy 0.7833 Epoch 00977 | Time(s) 0.0326 | Train Loss 0.0711 | Val Accuracy 0.7900 Epoch 00978 | Time(s) 0.0326 | Train Loss 0.0455 | Val Accuracy 0.7967 Epoch 00979 | Time(s) 0.0326 | Train Loss 0.0503 | Val Accuracy 0.8000 Epoch 00980 | Time(s) 0.0326 | Train Loss 0.0349 | Val Accuracy 0.8033 Epoch 00981 | Time(s) 0.0326 | Train Loss 0.1188 | Val Accuracy 0.8067 Epoch 00982 | Time(s) 0.0325 | Train Loss 0.0288 | Val Accuracy 0.7933 Epoch 00983 | Time(s) 0.0325 | Train Loss 0.0664 | Val Accuracy 0.7967 Epoch 00984 | Time(s) 0.0325 | Train Loss 0.0596 | Val Accuracy 0.7967 Epoch 00985 | Time(s) 0.0325 | Train Loss 0.0234 | Val Accuracy 0.7967 Epoch 00986 | Time(s) 0.0325 | Train Loss 0.0365 | Val Accuracy 0.7933 Epoch 00987 | Time(s) 0.0325 | Train Loss 0.0340 | Val Accuracy 0.7900 Epoch 00988 | Time(s) 0.0325 | Train Loss 0.0563 | Val Accuracy 0.7900 Epoch 00989 | Time(s) 0.0325 | Train Loss 0.0197 | Val Accuracy 0.7867 Epoch 00990 | Time(s) 0.0325 | Train Loss 0.0596 | Val Accuracy 0.7833 Epoch 00991 | Time(s) 0.0325 | Train Loss 0.0366 | Val Accuracy 0.7833 Epoch 00992 | Time(s) 0.0325 | Train Loss 0.0715 | Val Accuracy 0.7833 Epoch 00993 | Time(s) 0.0325 | Train Loss 0.0647 | Val Accuracy 0.7867 Epoch 00994 | Time(s) 0.0325 | Train Loss 0.0432 | Val Accuracy 0.7900 Epoch 00995 | Time(s) 0.0325 | Train Loss 0.0657 | Val Accuracy 0.7900 Epoch 00996 | Time(s) 0.0325 | Train Loss 0.0214 | Val Accuracy 0.7900 Epoch 00997 | Time(s) 0.0325 | Train Loss 0.0506 | Val Accuracy 0.7833 Epoch 00998 | Time(s) 0.0325 | Train Loss 0.0817 | Val Accuracy 0.7833 Epoch 00999 | Time(s) 0.0325 | Train Loss 0.0073 | Val Accuracy 0.7800 Test Accuracy 
0.7890
Trained this way, our GCN based on polynomials of the Laplacian is a black box. Fortunately, the only difference between this shallow model and our previous classifier is the way the filter coefficients were chosen: in both cases the network implements a logistic regression composed with a polynomial graph filter $p(L) = \sum_k c_k L^k$.
Let's see what the network learned. Print the coefficients of the learned filter.
coeff_gcn = model.pol_weights.detach().numpy()
print(coeff_gcn)
[-12.998725 16.839718 -1.9868492 -2.7973063]
To interpret the model, we can plot the frequency response of the learned filter.
plt.semilogy(lam, np.abs(polynomial_graph_filter_response(coeff_gcn, lam)))
plt.xlabel(r'$\lambda$')
plt.ylabel('Spectral response (dB)')
Text(0, 0.5, 'Spectral response (dB)')
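As a reminder, polynomial_graph_filter_response evaluates the filter polynomial $h(\lambda) = \sum_k c_k \lambda^k$ at each eigenvalue. The helper was defined earlier in the notebook; a minimal sketch, assuming that convention, could look as follows:
def polynomial_graph_filter_response(coeff, lam):
    """ Evaluate h(lambda) = sum_k coeff[k] * lambda**k at each eigenvalue.
    Sketch of the helper assumed to be defined earlier in the notebook. """
    lam = np.asarray(lam, dtype=float)
    response = np.zeros_like(lam)
    for k, c in enumerate(coeff):
        response += c * lam ** k
    return response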
As we said, the whole classification pipeline of the previous exercise is identical to the one we tried before: graph filtering + logistic regression. The only difference lies in how the filter coefficients were chosen: first we picked them manually, and now we let PyTorch find them for us. If everything is correct, we should therefore be able to use the learned filter to construct new hand-crafted features and train a logistic regression model that achieves good accuracy on the training set. Let's do that!
Use the learned coefficients to build a new feature extractor:
graph_gcn_filter = polynomial_graph_filter(coeff_gcn, laplacian)
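Here polynomial_graph_filter is the helper defined earlier in the notebook. Assuming it materializes the matrix polynomial $p(L) = \sum_k c_k L^k$ as a dense matrix, a minimal sketch could look as follows:
def polynomial_graph_filter(coeff, laplacian):
    """ Build p(L) = sum_k coeff[k] * L**k as a dense matrix.
    Sketch of the helper assumed to be defined earlier in the notebook. """
    filt = np.zeros_like(laplacian)
    power = np.eye(laplacian.shape[0])  # L^0
    for c in coeff:
        filt += c * power
        power = power @ laplacian  # next power of L
    return filt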
Let's extract the new features by filtering the data:
features_gcn = graph_gcn_filter @ features.numpy()
train_features_gcn = features_gcn[train_mask, :]
train_labels = labels[train_mask]
val_features_gcn = features_gcn[val_mask, :]
val_labels = labels[val_mask]
test_features_gcn = features_gcn[test_mask, :]
test_labels = labels[test_mask]
Train a logistic regression on these features. (Note that C=1e4 corresponds to a very weak $\ell_2$ penalty, so the model is essentially unregularized.)
log_reg_gcn = LogisticRegression(penalty='l2', multi_class="auto", solver="liblinear", C=1e4, fit_intercept=False, max_iter=1000)
log_reg_gcn.fit(train_features_gcn, train_labels)
LogisticRegression(C=10000.0, class_weight=None, dual=False, fit_intercept=False, intercept_scaling=1, l1_ratio=None, max_iter=1000, multi_class='auto', n_jobs=None, penalty='l2', random_state=None, solver='liblinear', tol=0.0001, verbose=0, warm_start=False)
Finally, let's evaluate this model:
train_acc = log_reg_gcn.score(train_features_gcn, train_labels)
val_acc = log_reg_gcn.score(val_features_gcn, val_labels)
test_acc = log_reg_gcn.score(test_features_gcn, test_labels)
print('Train accuracy {:.4f} | Validation accuracy {:.4f} | Test accuracy {:.4f}'.format(train_acc, val_acc, test_acc))
Train accuracy 1.0000 | Validation accuracy 0.7867 | Test accuracy 0.7890
The performance of this model may not be exactly the same as the one obtained with PyTorch. What differences in the training procedure can explain this gap?
Solution: the model is identical in both cases (a logistic regression composed with a Laplacian polynomial). However, there are two differences: