Messages

Published Nov. 3, 2025 08:42

Dear all, welcome back to a new week. We hope you've had a great weekend. 

The aim this week is to discuss convolutional neural networks (CNNs): the basic mathematics, the parameters we need to deal with, and how we can implement CNNs using TensorFlow/Keras and/or PyTorch. The lecture notes this week contain examples thereof. These examples can, for example, be used in project 3 if you plan to use CNNs. Raschka's text provides good documentation on how to use PyTorch for NNs, CNNs and recurrent neural networks (next week's topic).
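
To give a feel for the implementation side, here is a minimal PyTorch sketch of a small CNN classifier; the layer sizes and the 28x28 single-channel input are illustrative assumptions of ours, not taken from the lecture notes.

    import torch
    import torch.nn as nn

    # Minimal CNN sketch: two convolutional blocks followed by a linear
    # classifier head. Input is assumed to be a batch of 1x28x28 images.
    class SimpleCNN(nn.Module):
        def __init__(self, num_classes=10):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1 -> 16 channels
                nn.ReLU(),
                nn.MaxPool2d(2),                              # 28x28 -> 14x14
                nn.Conv2d(16, 32, kernel_size=3, padding=1),  # 16 -> 32 channels
                nn.ReLU(),
                nn.MaxPool2d(2),                              # 14x14 -> 7x7
            )
            self.classifier = nn.Linear(32 * 7 * 7, num_classes)

        def forward(self, x):
            x = self.features(x)
            x = x.flatten(1)           # flatten all but the batch dimension
            return self.classifier(x)

    model = SimpleCNN()
    print(model(torch.randn(8, 1, 28, 28)).shape)  # torch.Size([8, 10])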

This week there are no exercises; we will focus only on work on project 2. If you have not yet received feedback on project 1, it should arrive by the end of the day today. We apologize for the delay to those still waiting.

This week our plans are as follows:

Material for the lecture on Monday November 3, 202...

Published Oct. 27, 2025 07:36

Dear all, welcome back to a new exciting week!

This week we plan to wrap up last week's discussion of how to solve differential equations with neural networks (first lecture). Thereafter we start discussing the whys and hows of convolutional neural networks (CNNs) and parts of their mathematical foundations. Next week we will show how to construct codes for CNNs (relevant for project 3) and how to deal with image classification. For the last weeks of the semester, we will then focus on other deep learning methods such as recurrent neural networks (the basis of LLMs and of time-series studies) and finally autoencoders.
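
For a first taste of the differential-equation part, the basic idea is to parametrize a trial solution with a network and minimize the squared residual of the equation. A minimal sketch for the toy problem dy/dx = -y with y(0) = 1 (an example of our own choosing, with exact solution exp(-x)), using PyTorch's automatic differentiation:

    import torch
    import torch.nn as nn

    # Trial solution g(x) = 1 + x * N(x) satisfies g(0) = 1 by construction.
    net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
    optimizer = torch.optim.Adam(net.parameters(), lr=1e-2)

    x = torch.linspace(0, 2, 50).reshape(-1, 1).requires_grad_(True)
    for _ in range(2000):
        g = 1 + x * net(x)                     # trial solution
        dg = torch.autograd.grad(g, x, torch.ones_like(g), create_graph=True)[0]
        loss = ((dg + g) ** 2).mean()          # residual of dy/dx = -y
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    print(loss.item())  # should approach zero as g approaches exp(-x)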

The exercises this week mimic those of week 39, that is, getting started with the report for project 2.

Here's the detailed plan with reading suggestions plus videos.

Plan for week 44

Material for the lecture Monday October 27, 2025....

Published Oct. 20, 2025 08:05

Dear all, we hope you have had a great weekend. Here follows a short update about the plans for the coming week. 

Material for the lecture on Monday October 20, 2025.

  1. Reminder from last week; see also the lecture notes from week 42 at https://compphysics.github.io/MachineLearning/doc/LectureNotes/_build/html/week42.html, as well as those from week 41 at https://compphysics.github.io/MachineLearning/doc/LectureNotes/_build/html/week41.html.

  2. Building our own Feed-forward Neural Network.

  3. Coding examples using TensorFlow/Keras and PyTorch. The PyTorch examples are adapted from Raschka's text, see chapters 11-13.

    ...
Published Oct. 13, 2025 09:28

Dear all, welcome back to a new week and new exciting possibilities! We trust you had a restful and enjoyable weekend.

This week, both for the lectures and the lab sessions, the aim is to continue our discussions on how to build a feed-forward neural network (FFNN) code.

Neural networks are the basic building block of almost all advanced deep learning methods, from the standard FFNN, via convolutional neural networks, to generative methods and reinforcement learning. Mastering the basic building blocks is thus central to our understanding of most deep learning methods, which are the focus of the rest of the semester.

This week our focus is as follows:

Lecture October 13, 2025

  1. Building our own Feed-forward Neural Network and discussion of project 2

  2. Project 2 is available at ...

Published Oct. 5, 2025 20:29

Dear all, first of all thanks so much for your heroic efforts with project 1. We are truly impressed by what you have been doing. Keep up the good work, and best wishes to you all with the finalization of project 1.

This week we start discussing how to actually develop a neural network code; this will be the topic of the second project. We will make the project available next week and discuss it in more detail during the lectures and the lab sessions. This week we plan to start with a simpler set of exercises where you implement the feed-forward part of a neural network code. The exercises for this week can then in turn be used as a basis for the code in project 2.
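
To indicate the kind of code the exercises aim at, here is a minimal sketch of the feed-forward part in plain NumPy; the single hidden layer and the sigmoid activation are illustrative assumptions.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def feed_forward(X, W1, b1, W2, b2):
        # One hidden layer: affine transformation followed by activation,
        # then an affine output layer.
        a1 = sigmoid(X @ W1 + b1)   # hidden activations
        return a1 @ W2 + b2         # network output

    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 3))                    # 5 samples, 3 features
    W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)  # 4 hidden nodes
    W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)  # 1 output node
    print(feed_forward(X, W1, b1, W2, b2).shape)   # (5, 1)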

The plans this week are (see also links to various videos):

Material for the lecture on Monday October 6, 2025

  1. Neural Networks, setting up the basic steps, from the simple perc...

Published Sep. 29, 2025 07:46

Dear all, we hope you've had a great weekend.

Here are the updates and plans for this and the coming week.

Today we will continue the discussion of logistic regression that we started last week, with coding examples as well. We will repeat some of the essential elements and derivations. Logistic regression will serve as our stepping stone towards neural networks and deep learning methods. Next week we will devote our time to setting up a neural network code, and we will also introduce automatic differentiation, which lets us compute gradients and derivatives of different cost functions without having to code the derivative expressions by hand.
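
As a preview connecting the two topics, here is a minimal sketch of logistic regression trained by gradient descent, with the gradient supplied by PyTorch's automatic differentiation; the toy data and hyperparameters are made up for illustration.

    import torch
    import torch.nn.functional as F

    # Toy binary classification data (made up for illustration).
    torch.manual_seed(0)
    X = torch.randn(100, 2)
    y = (X[:, 0] + X[:, 1] > 0).float()

    w = torch.zeros(2, requires_grad=True)
    b = torch.zeros(1, requires_grad=True)

    for _ in range(500):
        z = X @ w + b                                     # logits
        loss = F.binary_cross_entropy_with_logits(z, y)   # cross-entropy cost
        loss.backward()                  # autograd fills w.grad and b.grad
        with torch.no_grad():
            w -= 0.1 * w.grad            # plain gradient-descent update
            b -= 0.1 * b.grad
            w.grad.zero_()
            b.grad.zero_()

    print(loss.item())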

The plans for this week, with some video recommendations, are:

Lecture Monday September 29, 2025

  1. Logistic regression and gradient descent, examples on how to code
    ...
Published Sep. 22, 2025 06:56

Dear all, welcome back to FYS-STK and a new week. We hope you had a great weekend.

Here are the plans for this week.

For the lecture on Monday the 22nd, the plans are

  1. Resampling techniques: bootstrap, cross-validation and the bias-variance tradeoff

  2. Logistic regression, our first classification encounter and a stepping stone towards neural networks

Readings and Videos, resampling methods

  1. Raschka et al., pages 175-192

  2. Hastie et al., Chapter 7; we recommend sections 7.1-7.5, 7.10 (cross-validation) and 7.11 (bootstrap). See https://link.springer.com/book/10.1007/978-0-387-84858-7.

  3. Video on bi...

Published Sep. 18, 2025 06:33

Dear all, the video from one of the lab sessions, where we discuss and derive the bias-variance tradeoff, is available at 

https://youtu.be/GBWc1abChKo

It may be of relevance for this week's exercises and obviously for project 1. Furthermore, the whiteboard notes for this week have been updated; you will find the derivation of the bias-variance tradeoff equations there as well, see https://github.com/CompPhysics/MachineLearning/blob/master/doc/HandWrittenNotes/2025/FYSSTKweek38.pdf
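
For quick reference, the decomposition derived there takes the standard pointwise form, assuming data y = f(x) + eps with E[eps] = 0 and Var(eps) = sigma^2, and with \tilde{y} denoting the model prediction:

    \mathbb{E}\left[(y-\tilde{y})^2\right]
      = \left(f - \mathbb{E}[\tilde{y}]\right)^2
      + \mathbb{E}\left[\left(\tilde{y} - \mathbb{E}[\tilde{y}]\right)^2\right]
      + \sigma^2 ,

that is, squared bias plus variance plus irreducible noise.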

Finally, there was a typo in exercise 4a (this has now been corrected). There should only be a single target, as we should not resample targets.

These lines:

predictions = n...

Published Sep. 14, 2025 09:16

Dear all, welcome back to a new exciting week. We hope you all have enjoyed and are still enjoying the weekend.

The plans this week are to start with a discussion of a statistical interpretation of OLS, Ridge and Lasso (to be continued next week).

This is relevant for this week's exercises and for the final part of project 1. We will also discuss the so-called bias-variance tradeoff and resampling methods like cross-validation and the bootstrap. The videos listed below may also be helpful; take a look at them before the lecture if you can. I am particularly fond of Josh Starmer's StatQuest videos, see https://statquest.org/
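
If you want to experiment before the lecture, here is a minimal sketch of the bootstrap pattern for the bias-variance analysis, using scikit-learn's resample; the linear model and the data are illustrative assumptions. Note that only the training pairs are resampled, while the test targets stay fixed.

    import numpy as np
    from sklearn.utils import resample
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 1, size=(200, 1))
    y = np.sin(2 * np.pi * x).ravel() + 0.1 * rng.normal(size=200)
    X_train, X_test, y_train, y_test = train_test_split(
        x, y, test_size=0.2, random_state=0)

    n_bootstraps = 200
    preds = np.empty((len(X_test), n_bootstraps))
    for i in range(n_bootstraps):
        Xb, yb = resample(X_train, y_train)   # resample training pairs only
        preds[:, i] = LinearRegression().fit(Xb, yb).predict(X_test)

    # The test targets stay fixed; only the predictions vary across bootstraps.
    mse = np.mean((y_test[:, None] - preds) ** 2)
    bias2 = np.mean((y_test - preds.mean(axis=1)) ** 2)
    variance = np.mean(preds.var(axis=1))
    print(mse, bias2, variance)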

Material for the lecture on Monday September 15.

  1. Statistical interpretation of OLS regression and other statistical properties

  2. Resampling techniques: bootstrap, cross-validation and the bias-variance tradeoff (this ma...

Published Sep. 7, 2025 13:32

Dear all, welcome back to FYS-STK3155/4155 and a new exciting week! Our plans this week are to discuss the family of gradient descent methods which we need to implement in the project (parts c-e, including Lasso regression); a minimal sketch of the update rules follows the list below.

  1. Plain gradient descent (constant learning rate), reminder from last week with examples using OLS and Ridge

  2. Improving gradient descent with momentum

  3. Introducing stochastic gradient descent

  4. More advanced updates of the learning rate: ADAgrad, RMSprop and ADAM
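
To make the update rules concrete, here is a minimal sketch of the plain, momentum and ADAM updates applied to the OLS cost function; the data and learning rates are made-up illustrations.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)

    def grad(theta):
        # Gradient of the OLS cost (1/n) ||y - X theta||^2
        return -2.0 / len(y) * X.T @ (y - X @ theta)

    # 1. Plain gradient descent with a constant learning rate
    theta = np.zeros(3)
    for _ in range(500):
        theta -= 0.1 * grad(theta)

    # 2. Gradient descent with momentum
    theta, v = np.zeros(3), np.zeros(3)
    for _ in range(500):
        v = 0.9 * v - 0.1 * grad(theta)
        theta += v

    # 3. ADAM: per-parameter learning rates from first and second moments
    theta, m, s = np.zeros(3), np.zeros(3), np.zeros(3)
    beta1, beta2, eta, eps = 0.9, 0.999, 0.1, 1e-8
    for t in range(1, 501):
        g = grad(theta)
        m = beta1 * m + (1 - beta1) * g
        s = beta2 * s + (1 - beta2) * g**2
        m_hat, s_hat = m / (1 - beta1**t), s / (1 - beta2**t)
        theta -= eta * m_hat / (np.sqrt(s_hat) + eps)

    print(theta)  # should be close to [1, -2, 0.5]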

Readings a