diff --git a/labs/l1-basic/nlp-l1-basic.ipynb b/labs/l1-basic/nlp-l1-basic.ipynb
index 9ac5fb6c8c96ea725b6e24ef446de8a2152ae7c2..551e8fc5677de65839750732eddecf3b6531ecd6 100644
--- a/labs/l1-basic/nlp-l1-basic.ipynb
+++ b/labs/l1-basic/nlp-l1-basic.ipynb
@@ -509,18 +509,22 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "## Problem 3: Parameter initialisation (reflection; 3 points)"
+   "## Problem 3: Parameter initialisation (3 points)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "The error surfaces that gradient search explores when training neural networks can be very complex. Because of this, it is important to choose “good” initial values for the parameters. In PyTorch, the weights of the embedding layer are initialised by sampling from the standard normal distribution $\\mathcal{N}(0, 1)$. Test how changing the standard deviation and/or the distribution affects the perplexity of your feed-forward language model. Find research articles that propose different types of initialisations. Write a short (150 words) report about your experiments and literature study. Use the following prompts:\n",
+   "The error surfaces explored when training neural networks can be very complex. Because of this, it is important to choose “good” initial values for the parameters. In PyTorch, the weights of the embedding layer are initialised by sampling from the standard normal distribution $\\mathcal{N}(0, 1)$. Test how changing the initialisation affects the perplexity of your feed-forward language model. Find research articles that propose different initialisation strategies.\n",
    "\n",
-   "* What different settings for the initialisation did you try? What results did you get?\n",
+   "Write a short (150 words) report about your experiments and literature search. Use the following prompts:\n",
+   "\n",
+   "* What different initialisations did you try? What results did you get?\n",
    "* How do your results compare to what was suggested by the research articles?\n",
-   "* What did you learn? How, exactly, did you learn it? Why does this learning matter?"
+   "* What did you learn? How, exactly, did you learn it? Why does this learning matter?\n",
+   "\n",
+   "You are allowed to consult sources for this problem if you appropriately cite them. If in doubt, please read the [Academic Integrity Policy](https://www.ida.liu.se/~TDDE09/logistics/policies.html#academic-integrity-policy)."
   ]
  },
  {
@@ -553,7 +557,8 @@
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
-  "pygments_lexer": "ipython3"
+  "pygments_lexer": "ipython3",
+  "version": "3.10.4"
  }
 },
 "nbformat": 4,
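For context on the revised problem text: a minimal sketch of how a student might vary the embedding initialisation in PyTorch, as the updated cell asks. This is not part of the diff or the lab's reference solution; the layer sizes are arbitrary, and the three alternatives shown (smaller standard deviation, uniform, Xavier/Glorot) are just examples of the strategies the problem invites students to compare.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Arbitrary sizes for illustration only.
embedding = nn.Embedding(num_embeddings=1000, embedding_dim=64)

# Default: PyTorch samples embedding weights from the standard normal N(0, 1).

# Alternative 1: a normal distribution with a smaller standard deviation.
nn.init.normal_(embedding.weight, mean=0.0, std=0.1)

# Alternative 2: a uniform distribution over a small interval.
nn.init.uniform_(embedding.weight, a=-0.05, b=0.05)

# Alternative 3: Xavier/Glorot initialisation, whose scale depends on the
# layer's fan-in and fan-out (Glorot & Bengio, 2010).
nn.init.xavier_uniform_(embedding.weight)
```

Each `nn.init.*_` call rewrites the weights in place, so in an actual experiment one would re-initialise and retrain the model separately for each setting before comparing perplexities.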