Hi, I’m Tal!
My research revolves around (Deep) Unsupervised Learning, Reinforcement Learning, and Robotics, and is done in the RL^2 lab, which is part of the Control, Robotics and Machine Learning (CRML) lab.
I received my BSc (Cum Laude) and MSc (Summa Cum Laude), both in Electrical Engineering, from the Technion, where I wrote my MSc thesis, “Deep Variational Semi-Supervised Novelty Detection”, under the supervision of Prof. Aviv Tamar.
During my time at the Technion, I have also been a teaching assistant in Machine Learning courses, for which I have written theory-based and hands-on tutorials that you can access in the “Teaching” section.
You can contact me at
taldanielm at campus dot technion dot ac dot il.
Publications and Pre-Prints
Soft-IntroVAE: Analyzing and Improving the Introspective Variational Autoencoder
Tal Daniel and Aviv Tamar
CVPR 2021 Oral
arXiv Pre-print, 2021
TL;DR - Stable adversarial training of VAEs without a discriminator, applicable to density estimation, image generation, image translation, out-of-distribution detection, and more.
Deep Variational Semi-Supervised Novelty Detection
Tal Daniel, Thanard Kurutach and Aviv Tamar
arXiv Pre-print, 2020
TL;DR - Principled incorporation of negative samples in the VAE framework for meaningful representations.
Beyond Credential Stuffing: Password Similarity Models Using Neural Networks
Bijeeta Pal, Tal Daniel, Rahul Chatterjee and Thomas Ristenpart
2019 IEEE Symposium on Security and Privacy (SP)
TL;DR - Cracking passwords with neural networks, and defending against such attacks.
I teach several courses at the Technion and make my materials available to everyone on GitHub. All the tutorials include theory (with a lot of math) and code (Python), and are provided as Jupyter Notebooks (which I really like!), with PDF versions also available. For the deep learning parts, PyTorch is my framework of choice.
EE - 046211 - Deep Learning
Topics: Single Neuron, PyTorch Basics, Optimization and Gradient Descent-based Algorithms, Automatic Differentiation (AutoDiff) and PyTorch's AutoGrad, Multilayer Neural Networks, Convolutional Neural Networks (CNNs), Sequential Tasks, Recurrent Neural Networks (RNNs), Attention, Transformer, Training Methods, Bayesian Hyper-parameter Tuning with Optuna, Transfer Learning, Representation and Self-Supervised Learning
EE - 046746 - Computer Vision
Topics: Image Processing Basics, PyTorch Basics, 2D Convolution, Convolutional Neural Networks, (Deep) Semantic Segmentation, (Deep) Object Detection, (Deep) Object Tracking, Generative Adversarial Network (GAN), 3D Deep Learning Basics
Spring 2020 (with Dahlia Urbach), Spring 2021 (with Elias Nehme)
EE - 046202 - Unsupervised Learning and Data Analysis
Topics: Statistics (estimators, confidence intervals, hypothesis testing), Dimensionality Reduction (PCA, KPCA, t-SNE), Deep Generative Models (VAE, GAN), Clustering (K-Means, EM algorithm, Spectral Clustering)
Winter 2020, Winter 2021
CS - 236756 - Introduction to Machine Learning
Topics: Probability and Linear Algebra Basics, PCA, Feature Selection, Evaluation and Validation methods, Optimization, Decision Trees, Linear Regression, Linear Classifiers, EM algorithm, Boosting and Bagging, SVM, Deep Learning introduction, PAC Learning
Spring 2019, Spring 2020
Other Projects and Cool Stuff
Interpolation between airplane and car in the latent space of 3D Soft-IntroVAE
Python Implementation of Pencil Drawing by Sketch and Tone (Lu et al., NPAR 2012)
PyTorch implementation of Least-Squares DQN (Levine, Zahavy et al., NeurIPS 2017) with extras (Dueling DQN, Boosted FQI)
TensorFlow implementation of the Bayesian Gradient Descent algorithm (Zeno et al., 2019)