T.H.C. Heshan / 2023-232 · Commits

Commit 808d3c2b, authored May 26, 2023 by T.H.C. Heshan
Update model.py
parent 054d3337

Showing 1 changed file with 6 additions and 0 deletions.
model.py (+6, -0)
@@ -102,16 +102,22 @@ norm_layer.adapt(data=train_spectrogram_ds.map(map_func=lambda spec, label: spec
 model = models.Sequential([
     layers.Input(shape=input_shape),
     # Downsample the input.
+    # This layer resizes the input to a target size of 32x32. It is commonly
+    # used to ensure a consistent input shape for subsequent layers.
     layers.Resizing(32, 32),
     # Normalize.
+    # This is the normalization layer that you previously adapted with the training data.
     norm_layer,
+    # This is a convolutional layer with 32 filters, a kernel size of 3x3, and ReLU
+    # activation. It performs convolutional operations on the input.
     layers.Conv2D(32, 3, activation='relu'),
     layers.Conv2D(64, 3, activation='relu'),
+    # This layer performs max pooling, which reduces the spatial dimensions of the input data.
     layers.MaxPooling2D(),
     layers.Dropout(0.25),
+    # This layer flattens the input data into a 1-dimensional vector, preparing
+    # it for the fully connected layers.
     layers.Flatten(),
     layers.Dense(128, activation='relu'),
     layers.Dropout(0.5),
+    # This is the output layer with num_labels units, representing the number of
+    # output classes in your classification problem.
     layers.Dense(num_labels),
 ])
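The layer stack above determines the tensor shapes purely arithmetically: each 3x3 Conv2D with default 'valid' padding trims 2 from each spatial side, and the default MaxPooling2D halves them. A minimal sketch tracing those dimensions in plain Python (assuming the input is resized to 32x32, per the Resizing layer; no TensorFlow needed):

```python
def conv2d_valid(size, kernel=3):
    # 'valid' padding: output side shrinks by (kernel - 1)
    return size - (kernel - 1)

def max_pool(size, pool=2):
    # default 2x2 pooling halves each spatial side (floor division)
    return size // pool

side = 32                   # after layers.Resizing(32, 32)
side = conv2d_valid(side)   # Conv2D(32, 3)  -> 30x30
side = conv2d_valid(side)   # Conv2D(64, 3)  -> 28x28
side = max_pool(side)       # MaxPooling2D() -> 14x14
flat = side * side * 64     # Flatten over 64 channels -> 12544
print(side, flat)           # -> 14 12544
```

So the Flatten layer hands a 12544-element vector to the Dense(128) layer, which is why a Dropout of 0.5 before the final Dense(num_labels) output is a reasonable regularizer at that width.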