{"id":4270,"date":"2026-01-16T11:19:28","date_gmt":"2026-01-16T11:19:28","guid":{"rendered":"https:\/\/devserver.admin.uoc.gr\/damsl\/?page_id=4270"},"modified":"2026-01-16T11:21:07","modified_gmt":"2026-01-16T11:21:07","slug":"damsl-273-introduction-to-deep-generative-modelling","status":"publish","type":"page","link":"https:\/\/mscs.uoc.gr\/damsl\/damsl-273-introduction-to-deep-generative-modelling\/","title":{"rendered":"DAMSL-273 Introduction to Deep Generative Modelling"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-page\" data-elementor-id=\"4270\" class=\"elementor elementor-4270\" data-elementor-post-type=\"page\">\n\t\t\t\t<div class=\"elementor-element elementor-element-47e7b33 e-flex e-con-boxed e-con e-parent\" data-id=\"47e7b33\" data-element_type=\"container\" data-settings=\"{&quot;background_background&quot;:&quot;classic&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-2c87900 e-con-full e-flex e-con e-child\" data-id=\"2c87900\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-63fb3b4 elementor-widget elementor-widget-text-editor\" data-id=\"63fb3b4\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><strong style=\"font-size: 22px;\">Type<\/strong><\/p><p><strong style=\"font-size: 16px;\">Elective<\/strong><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-6ff4fab e-con-full e-flex e-con e-child\" data-id=\"6ff4fab\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-cbbdc33 elementor-widget elementor-widget-text-editor\" data-id=\"cbbdc33\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><strong style=\"font-size: 22px;\">Course 
Code<\/strong><\/p><p><strong style=\"font-size: 16px;\">DAMSL-273<\/strong><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-4728792 e-con-full e-flex e-con e-child\" data-id=\"4728792\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-e4dd7f1 elementor-widget elementor-widget-text-editor\" data-id=\"e4dd7f1\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><strong style=\"font-size: 22px;\">Teaching Semester<\/strong><\/p><p><strong style=\"font-size: 16px;\">B semester<\/strong><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-74b417e e-con-full e-flex e-con e-child\" data-id=\"74b417e\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-c4e7ca8 elementor-widget elementor-widget-text-editor\" data-id=\"c4e7ca8\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><strong style=\"font-size: 22px;\">ECTS Credits<\/strong><\/p><p><strong style=\"font-size: 16px;\">10<\/strong><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-aea9e47 e-flex e-con-boxed e-con e-parent\" data-id=\"aea9e47\" data-element_type=\"container\" data-settings=\"{&quot;background_background&quot;:&quot;classic&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-ca881e9 e-grid e-con-full e-con e-child\" data-id=\"ca881e9\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-8169c04 elementor-widget elementor-widget-text-editor\" data-id=\"8169c04\" data-element_type=\"widget\" 
data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<div class=\"et_pb_module et_pb_text et_pb_text_5_tb_body et_pb_text_align_left et_pb_bg_layout_light\"><div class=\"et_pb_text_inner\"><div class=\"custom-field course-field \"><h6>Student Performance Evaluation<\/h6><p>Homework and\/or Lab Assignments, Final Exam and\/or Project<\/p><\/div><\/div><\/div><div class=\"et_pb_module et_pb_text et_pb_text_6_tb_body et_pb_text_align_left et_pb_bg_layout_light\"><div class=\"et_pb_text_inner\"><div class=\"custom-field course-field \"><h6>Prerequisite Courses<\/h6><p>Linear Algebra, Probability<\/p><\/div><\/div><\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-ca0c741 e-flex e-con-boxed e-con e-parent\" data-id=\"ca0c741\" data-element_type=\"container\" data-settings=\"{&quot;background_background&quot;:&quot;classic&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-6f13111 e-grid e-con-full e-con e-child\" data-id=\"6f13111\" data-element_type=\"container\" data-settings=\"{&quot;background_background&quot;:&quot;classic&quot;}\">\n\t\t\t\t<div class=\"elementor-element elementor-element-6361eff elementor-widget elementor-widget-text-editor\" data-id=\"6361eff\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<div id=\"outcomes\" class=\"et_pb_module et_pb_text et_pb_text_1_tb_body et_pb_text_align_left et_pb_bg_layout_light\"><div class=\"et_pb_text_inner\"><div class=\"custom-field course-main \"><h6><span style=\"text-decoration: underline;\">Syllabus<\/span><\/h6><ul><li>Random variables, conditional probability, chain rule, Bayes' theorem, central limit theorem, multivariate Gaussian, PDF of a transformed random variable, inverse transform sampling<\/li><li>Gaussian mixture models, Expectation-Maximization algorithm, maximum likelihood estimation, basics of Monte Carlo methods, Metropolis algorithm &amp; Markov chain Monte Carlo<\/li><li>Definition, time-series generation as the benchmark example, capturing long-range correlations, NN architectures (dilated CNNs, RNNs &amp; transformers), MLE optimization, discrete vs continuous state space, conditional DAR models, presentation of the WaveNet, WaveRNN and PixelRNN architectures and their training algorithm<\/li><li>Definition of an autoencoder, definition of VAE, MLE approximation, training algorithm, Denoising VAEs, beta-VAE, applications in scientific discovery<\/li><li>Definition, invertible transformations, tractable exact MLE, derivation of the general formula, presentation of the training algorithm, how it can extend the VAE model<\/li><li>What is information? Shannon entropy, divergences (KLD, f, alpha, Renyi), variational representation\/duality formulas (these scale well with dimension, whereas density-ratio methods collapse on high-dimensional datasets), probability distances (Wasserstein, MMD), examples with Gaussians and\/or the exponential family of distributions<\/li><li>Definition as a minimization problem, intractable due to unknown PDFs, use of variational formulas, minimax optimization, vanilla GAN, training algorithm (stochastic gradient descent\/ascent), basic properties, Wasserstein GAN, Conditional GAN, DCGAN, BigGAN, MelGAN, InfoGAN, CycleGAN (as a more general adversarial type of learning)<\/li><li>Definition, architectures, training algorithms (score matching, contrastive estimation), product of experts<\/li><li>Definition, forward\/reverse process, MLE approximation, denoising DPMs, applications, DALL-E 2 &amp; Imagen models for text-to-image generation<\/li><\/ul><\/div><\/div><\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-caa1dca elementor-widget 
elementor-widget-text-editor\" data-id=\"caa1dca\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<h6><span style=\"text-decoration: underline;\">Learning Outcomes<\/span><\/h6><div id=\"outcomes\" class=\"et_pb_module et_pb_text et_pb_text_2_tb_body et_pb_text_align_left et_pb_bg_layout_light\"><div class=\"et_pb_text_inner\"><div class=\"custom-field course-main \"><ul><li>Having attended and succeeded in the course, the student is able to describe the probabilistic foundations of deep generative models and gain knowledge about various model architectures, training algorithms and their underlying principles.<\/li><li>Having attended and succeeded in the course, the student is able to comprehend the application areas of deep generative models in fields like computer vision, language and speech processing.<\/li><li>Having attended and succeeded in the course, the student is capable of applying learned concepts to implement and train deep generative models and utilize these models for tasks such as synthetic tabular data generation, time-series synthesis and generative image processing.<\/li><li>Having attended and succeeded in the course, the student is able to analyze and compare different deep generative models, understand their strengths and limitations, and critically assess the performance of these models in various scenarios.<\/li><li>Having attended and succeeded in the course, the student is able to develop new approaches in deep generative modeling and distill information from research papers and practical demonstrations to create innovative solutions.<\/li><li>Having attended and succeeded in the course, the student is able to critically evaluate the effectiveness of deep generative models in real-world applications and assess the impact of these models in advancing the field of generative 
AI.<\/li><\/ul><\/div><\/div><\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-396be16 e-flex e-con-boxed e-con e-parent\" data-id=\"396be16\" data-element_type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-93f3c11 elementor-widget elementor-widget-spacer\" data-id=\"93f3c11\" data-element_type=\"widget\" data-widget_type=\"spacer.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-spacer\">\n\t\t\t<div class=\"elementor-spacer-inner\"><\/div>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>Type Elective Course Code DAMSL-273 Teaching Semester B semester ECTS Credits 10 Student Performance Evaluation Homework and\/or Lab Assignments, Final Exam and\/or Project Prerequisite Courses Linear Algebra, Probability Syllabus Random variables, conditional probability, chain rule, Bayes' theorem, central limit theorem, multivariate Gaussian, PDF of a transformed random variable, inverse transform sampling Gaussian mixture models, Expectation-Maximization algorithm, maximum likelihood estimation, basics of Monte Carlo methods, Metropolis algorithm &amp; Markov chain Monte Carlo Definition, time-series generation as the benchmark example, capturing long-range correlations, NN architectures (dilated CNNs, RNNs &amp; transformers), MLE optimization, discrete vs continuous state space, conditional DAR models, presentation of the WaveNet, WaveRNN and PixelRNN architectures and their training algorithm Definition of an autoencoder, definition of VAE, MLE approximation, training algorithm, Denoising VAEs, beta-VAE, applications in scientific discovery Definition, invertible transformations, tractable exact MLE, derivation of the general formula, presentation of the training algorithm, how it can extend the VAE model What is information? Shannon entropy, divergences (KLD, f, alpha, Renyi), variational representation\/duality formulas (these scale well with dimension, whereas density-ratio methods collapse on high-dimensional datasets), probability distances (Wasserstein, MMD), examples with Gaussians and\/or the exponential family of distributions Definition as a minimization problem, intractable due to unknown PDFs, use of variational formulas, minimax optimization, vanilla GAN, training algorithm (stochastic gradient descent\/ascent), basic properties, Wasserstein GAN, Conditional GAN, DCGAN, BigGAN, MelGAN, InfoGAN, CycleGAN (as a more general adversarial type of learning) Definition, architectures, training algorithms (score matching, contrastive estimation), product of experts Definition, forward\/reverse process, MLE approximation, denoising DPMs, applications, DALL-E 2 &amp; Imagen models for text-to-image generation Learning Outcomes Having attended and succeeded in the course, the student is able to describe the probabilistic foundations of deep generative models and gain knowledge about various model architectures, training algorithms and their underlying principles. Having attended and succeeded in the course, the student is able to comprehend the application areas of deep generative models in fields like computer vision, language and speech processing. Having attended and succeeded in the course, the student is capable of applying learned concepts to implement and train deep generative models and utilize these models for tasks such as synthetic tabular data generation, time-series synthesis and generative image processing. Having attended and succeeded in the course, the student is able to analyze and compare different deep generative models, understand their strengths and limitations, and critically assess the performance of these models in various scenarios. 
Having attended and succeeded in the course, the student is able to develop new approaches in deep generative modeling and distill information from research papers and practical demonstrations to create innovative solutions. Having attended and succeeded in the course, the student is able to critically evaluate the effectiveness of deep generative models in real-world applications and assess the impact of these models in advancing the field of generative AI.<\/p>\n","protected":false},"author":194,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_acf_changed":false,"inline_featured_image":false,"footnotes":""},"class_list":["post-4270","page","type-page","status-publish","hentry","post-no-thumbnail"],"acf":[],"_links":{"self":[{"href":"https:\/\/mscs.uoc.gr\/damsl\/wp-json\/wp\/v2\/pages\/4270","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/mscs.uoc.gr\/damsl\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/mscs.uoc.gr\/damsl\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/mscs.uoc.gr\/damsl\/wp-json\/wp\/v2\/users\/194"}],"replies":[{"embeddable":true,"href":"https:\/\/mscs.uoc.gr\/damsl\/wp-json\/wp\/v2\/comments?post=4270"}],"version-history":[{"count":4,"href":"https:\/\/mscs.uoc.gr\/damsl\/wp-json\/wp\/v2\/pages\/4270\/revisions"}],"predecessor-version":[{"id":4298,"href":"https:\/\/mscs.uoc.gr\/damsl\/wp-json\/wp\/v2\/pages\/4270\/revisions\/4298"}],"wp:attachment":[{"href":"https:\/\/mscs.uoc.gr\/damsl\/wp-json\/wp\/v2\/media?parent=4270"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}