ufal / npfl114
Materials for the Deep Learning -- ÚFAL course NPFL114
License: Creative Commons Attribution Share Alike 4.0 International
The expected output of the model in cags_segmentation.py
is a probabilistic mask of shape (batch_size, 224, 224, 1)
with tf.float32 elements
in the [0, 1] range (i.e. probabilities).
Therefore, the gold mask should have the corresponding shape (batch_size, 224, 224, 1),
with binary 0/1 elements indicating presence of the mask, compared using a 0.5 threshold.
The comments were updated to warn that the mask's dtype is tf.uint8;
however, I would still expect the masks to have binary 0/255 values. Nevertheless, when I plotted one of the masks, the tensor contained many different unique values and was not binary. This might be hard to discover at first, since the masks at https://ufal.mff.cuni.cz/~straka/courses/npfl114/2223/demos/cags_train.html appear to be binary. I propose binarizing the masks inside the dataset itself, not only inside MaskIoUMetric;
otherwise, for the model training to work, users have to binarize the masks correctly themselves, which, in turn, breaks the MaskIoUMetric
(at least it did for me; I have not yet discovered why that was the case (see the row at the bottom)).
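For illustration, the binarization I have in mind could look like this sketch (the 127 threshold is my assumption about where foreground values lie in the non-binary uint8 masks):

```python
import tensorflow as tf

def binarize_mask(mask: tf.Tensor) -> tf.Tensor:
    """Convert a tf.uint8 mask with arbitrary values to a float32 0/1 mask."""
    # Treat any value above half of the uint8 range as foreground.
    return tf.cast(mask > 127, tf.float32)

# Example: a non-binary uint8 mask becomes a clean 0/1 mask.
mask = tf.constant([[0, 5, 200], [255, 90, 130]], dtype=tf.uint8)
print(binarize_mask(mask).numpy())  # -> [[0. 0. 1.] [1. 0. 1.]]
```

Applying this once in the dataset-building pipeline would make both training and MaskIoUMetric see consistent masks.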
https://github.com/ufal/npfl114/blob/master/labs03/1-mnist.py
Hello, I tried to change your code to use a .tfrecord file as the data source, but it doesn't work. Can you help me solve my problem? I posted a question on Stack Overflow:
https://stackoverflow.com/questions/46985411/tensorflow-read-tfrecord-without-a-graph
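In recent TensorFlow versions, a .tfrecord file can be read eagerly with tf.data, without building a graph or session. Here is a self-contained sketch; the feature names ("image", "label") and types are placeholders, not the course's actual schema, and must match whatever was written into the file:

```python
import tensorflow as tf

# Write one toy example so the reading code below is runnable end to end.
with tf.io.TFRecordWriter("toy.tfrecord") as writer:
    example = tf.train.Example(features=tf.train.Features(feature={
        "image": tf.train.Feature(bytes_list=tf.train.BytesList(
            value=[bytes(range(4))])),
        "label": tf.train.Feature(int64_list=tf.train.Int64List(value=[7])),
    }))
    writer.write(example.SerializeToString())

# Feature spec describing how to parse each serialized example.
feature_spec = {
    "image": tf.io.FixedLenFeature([], tf.string),
    "label": tf.io.FixedLenFeature([], tf.int64),
}

def parse_example(serialized):
    parsed = tf.io.parse_single_example(serialized, feature_spec)
    image = tf.io.decode_raw(parsed["image"], tf.uint8)
    return image, parsed["label"]

dataset = tf.data.TFRecordDataset("toy.tfrecord").map(parse_example)
for image, label in dataset:
    print(image.numpy(), label.numpy())  # -> [0 1 2 3] 7
```

The Stack Overflow question is about reading without a graph; the eager tf.data pipeline above sidesteps graph construction entirely.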
Hi,
the outputs in the 'Examples' section of the sequence_classification task
are wildly different from what I'm getting locally. The outputs in the 'Tests' section are identical to mine, and my solution also passed the tests in ReCodEx. Perhaps you used a different seed? I doubt there's such a huge difference between macOS and Linux.
Btw. it seems you swapped the order of the 'Examples' and 'Tests' sections on the webpage :D
Example of the output I'm getting:
$ python3 sequence_classification.py --rnn=LSTM --epochs=5 --hidden_layer=50
Epoch 1/5 loss: 0.6828 - accuracy: 0.5167 - val_loss: 0.6590 - val_accuracy: 0.5178
Epoch 2/5 loss: 0.6441 - accuracy: 0.5408 - val_loss: 0.6303 - val_accuracy: 0.5264
Epoch 3/5 loss: 0.6227 - accuracy: 0.5565 - val_loss: 0.6145 - val_accuracy: 0.5573
Epoch 4/5 loss: 0.6108 - accuracy: 0.5579 - val_loss: 0.6020 - val_accuracy: 0.5617
Epoch 5/5 loss: 0.5932 - accuracy: 0.5699 - val_loss: 0.5876 - val_accuracy: 0.5717
Cheers.
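If the difference really does come down to seeding, the relevant check is whether all random sources are pinned before training. A minimal sketch of reproducible runs (assuming TensorFlow ≥ 2.7, where tf.keras.utils.set_random_seed exists):

```python
import tensorflow as tf

# Fix the Python, NumPy, and TensorFlow random sources at once, so
# repeated runs on the same machine produce identical training logs.
tf.keras.utils.set_random_seed(42)
a = tf.random.uniform([3])

# Re-seeding and re-running yields the exact same values.
tf.keras.utils.set_random_seed(42)
b = tf.random.uniform([3])
print(bool(tf.reduce_all(a == b)))  # -> True
```

Even with identical seeds, results can still differ across TensorFlow versions or hardware, which would be consistent with matching 'Tests' but diverging 'Examples'.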
In the assignments tab, in the section for sgd_manual, the hyperlink for sgd_backpropagation redirects to a former year of this course
(the link: https://ufal.mff.cuni.cz/courses/npfl114/2021-summer#sgd_backpropagation).
It should redirect to the current year's course, i.e. https://ufal.mff.cuni.cz/courses/npfl114/2022-summer#sgd_backpropagation,
or something of that nature.
Hello,
in the file https://github.com/ufal/npfl114/blob/master/labs/01/pca_first.py, this section:
# TODO: Now run `args.iterations` of the power iteration algorithm.
# Start with a vector of `cov.shape[0]` ones of type tf.float32 using `tf.ones`.
v = None
for i in range(args.iterations):
    # TODO: In the power iteration algorithm, we compute
    # 1. v = cov * v
    #    The matrix-vector multiplication can be computed using `tf.linalg.matvec`.
    # 2. s = l2_norm(v)
    #    The l2_norm can be computed using `tf.linalg.norm`.
    # 3. v = v / s
# The `v` is now the eigenvector of the largest eigenvalue, `s`. We now
# compute the explained variance, which is a ratio of `s` and `total_variance`.
explained_variance = s / total_variance
It would be nice to add something to the code block inside the for loop, for example:
# TODO: Now run `args.iterations` of the power iteration algorithm.
# Start with a vector of `cov.shape[0]` ones of type tf.float32 using `tf.ones`.
v = None
for i in range(args.iterations):
    # TODO: In the power iteration algorithm, we compute
    # 1. v = cov * v
    #    The matrix-vector multiplication can be computed using `tf.linalg.matvec`.
    # 2. s = l2_norm(v)
    #    The l2_norm can be computed using `tf.linalg.norm`.
    # 3. v = v / s
    ...
# The `v` is now the eigenvector of the largest eigenvalue, `s`. We now
# compute the explained variance, which is a ratio of `s` and `total_variance`.
explained_variance = s / total_variance
That way, the template is valid code that can be executed without throwing an error.
Thanks :)
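For completeness, a filled-in version of the loop could look like this sketch (one possible solution, not the reference one):

```python
import tensorflow as tf

def power_iteration(cov: tf.Tensor, iterations: int):
    """Return the dominant eigenvector and eigenvalue estimate of `cov`."""
    # Start with a vector of `cov.shape[0]` ones of type tf.float32.
    v = tf.ones(cov.shape[0], dtype=tf.float32)
    for _ in range(iterations):
        v = tf.linalg.matvec(cov, v)  # 1. v = cov * v
        s = tf.linalg.norm(v)         # 2. s = l2_norm(v)
        v = v / s                     # 3. v = v / s
    return v, s

# On a diagonal matrix, `s` converges to the largest eigenvalue, 2.0.
cov = tf.constant([[2.0, 0.0], [0.0, 1.0]])
v, s = power_iteration(cov, 50)
print(s.numpy())
```

With 50 iterations the estimate is effectively exact here, since the convergence rate is governed by the ratio of the two largest eigenvalues.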
This comment (and a few others in the other files as well) mentions [EOW],
but it is actually [BOW]
in the results. This is because BOW
and EOW
are both defined as 1
in the MorphoDataset. I don't know whether this is done on purpose, but I am guessing it's not, because characters are added for both of them here.
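A toy sketch of the collision I mean (the code below is illustrative, not the actual MorphoDataset implementation):

```python
# Hypothetical special-token ids mirroring the pattern in MorphoDataset.
PAD = 0
BOW = 1
EOW = 1  # collides with BOW

# If decoding maps each id to the *first* special token with that value,
# every EOW in the output is rendered as [BOW]:
specials = {"[PAD]": PAD, "[BOW]": BOW, "[EOW]": EOW}
id_to_token = {}
for token, idx in specials.items():
    id_to_token.setdefault(idx, token)  # first writer wins
print(id_to_token[EOW])  # -> [BOW]
```

Giving BOW and EOW distinct indices would make the decoded output match the comments.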
The site's hyperlink for bboxes_utils.py doesn't work: it gives a 404 error, because lab06 doesn't contain the script.
In pca_first.py there is the line
from mnist import MNIST
but this mnist module is nowhere to be found.
Even after installing tf and gym, it still gives errors.
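As a temporary workaround, a stand-in for the missing module can be sketched on top of the Keras bundled dataset. Note that the attribute layout below (train.data["images"], train.data["labels"]) is my guess at what the template expects, not the actual course helper:

```python
import numpy as np
import tensorflow as tf

class MNIST:
    """Hypothetical stand-in for the course's `mnist.MNIST` helper class."""

    class Dataset:
        def __init__(self, images, labels):
            # Add a channel axis so images are [N, 28, 28, 1].
            self.data = {"images": images[..., None], "labels": labels}

    def __init__(self):
        (train_x, train_y), (test_x, test_y) = \
            tf.keras.datasets.mnist.load_data()
        self.train = MNIST.Dataset(train_x, train_y)
        self.test = MNIST.Dataset(test_x, test_y)
```

The real fix is of course to run the template from the labs directory that ships the mnist module (or copy it next to the script).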
args.logdir is not in the base template, making people lose data by not saving it.
The resulting error is masked by the line
os.environ.setdefault("TF_CPP_MIN_LOG_LEVEL", "2") # Report only TF errors by default
which I think should NOT be used.
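For reference, a logdir definition could be sketched like this, modeled on the pattern used in later npfl114 templates (the exact naming scheme is my assumption):

```python
import argparse
import datetime
import os
import re

parser = argparse.ArgumentParser()
parser.add_argument("--batch_size", default=50, type=int)
args = parser.parse_args([])  # empty list here just for the demo

# Derive a unique log directory from the script name, a timestamp, and
# abbreviated argument names (e.g. batch_size -> bs).
args.logdir = os.path.join("logs", "{}-{}-{}".format(
    os.path.basename(globals().get("__file__", "notebook")),
    datetime.datetime.now().strftime("%Y-%m-%d_%H%M%S"),
    ",".join("{}={}".format(re.sub("(.)[^_]*_?", r"\1", k), v)
             for k, v in sorted(vars(args).items())),
))
print(args.logdir)
```

With something like this in the base template, TensorBoard callbacks have a destination from the start and no data is silently dropped.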