deep-learning on Ethan Rosenthal
https://www.ethanrosenthal.com/tags/deep-learning/
Recent content in deep-learning on Ethan Rosenthal (Hugo -- gohugo.io, en-US)

Alignimation: Differentiable, Semantic Image Registration with Kornia
https://www.ethanrosenthal.com/2021/11/03/alignimation/
Wed, 03 Nov 2021

I had a kid at the start of the year.
Hold for applause
Well, not me personally, but my wife did.
I only tell you this in order to tell you that I took a picture of my wife every week that she was pregnant.
We thought maybe it’d be interesting to look back at these pictures one day. She wore the same outfit and faced the same direction for each picture, although the background occasionally changed.

Optimal Peanut Butter and Banana Sandwiches
https://www.ethanrosenthal.com/2020/08/25/optimal-peanut-butter-and-banana-sandwiches/
Tue, 25 Aug 2020

I was personally useless for most of the Spring of 2020. There was a period of time, though, after the peak in coronavirus cases here in NYC and before the onslaught of police violence here in NYC that I managed to scrounge up the motivation to do something other than drink and maniacally refresh my Twitter feed. I set out to work on something completely meaningless. It was almost therapeutic to work on a project with no value of any kind (insert PhD joke here).

Time Series for scikit-learn People (Part III): Horizon Optimization
https://www.ethanrosenthal.com/2019/02/18/time-series-for-scikit-learn-people-part3/
Mon, 18 Feb 2019

In my previous posts in the “time series for scikit-learn people” series, I discussed how one can train a machine learning model to predict the next element in a time series. Often, one may want to predict the value of the time series further in the future. In those posts, I gave two methods to accomplish this. One method is to train the machine learning model to specifically predict that point in the future.

spacecutter: Ordinal Regression Models in PyTorch
https://www.ethanrosenthal.com/2018/12/06/spacecutter-ordinal-regression/
Thu, 06 Dec 2018

How would you build a machine learning algorithm to solve the following types of problems?
Predict which medal athletes will win in the Olympics. Predict how a shoe will fit a foot (too small, perfect, too big). Predict how many stars a critic will rate a movie. If you reach into your typical toolkit, you’ll probably reach for either regression or multiclass classification. For regression, maybe you treat the number of stars (1-5) in the movie critic question as your target, and you train a model using mean squared error as your loss function.

Matrix Factorization in PyTorch
https://www.ethanrosenthal.com/2017/06/20/matrix-factorization-in-pytorch/
Tue, 20 Jun 2017

Update 7/8/2019: Upgraded to PyTorch version 1.0. Removed the now-deprecated Variable framework.
Update 8/4/2020: Added missing optimizer.zero_grad() call. Reformatted code with black.
Hey, remember when I wrote those ungodly long posts about matrix factorization chock-full of gory math? Good news! You can forget it all. We have now entered the Era of Deep Learning, and automatic differentiation shall be our guiding light.
Less facetiously, I have finally spent some time checking out these new-fangled deep learning frameworks, and damn if I am not excited.

From Analytical to Numerical to Universal Solutions
https://www.ethanrosenthal.com/2017/03/20/analytical-numerical-universal/
Mon, 20 Mar 2017

I’ve been making my way through the recently released Deep Learning textbook (which is absolutely excellent), and I came upon the section on Universal Approximation Properties. The Universal Approximation Theorem (UAT) essentially proves that neural networks are capable of approximating any continuous function (subject to some constraints and with upper bounds on compute).
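As a toy illustration of that representational power (my sketch, not code from the post): a one-hidden-layer ReLU network with just two hidden units can represent the continuous function |x| exactly, since |x| = relu(x) + relu(-x).

```python
def relu(z):
    """Rectified linear unit: max(0, z)."""
    return max(0.0, z)

def tiny_net(x):
    # One hidden layer with two units. Hidden weights are [1, -1],
    # output weights are [1, 1], and all biases are zero.
    hidden = [relu(1.0 * x), relu(-1.0 * x)]
    return 1.0 * hidden[0] + 1.0 * hidden[1]

# The network reproduces |x| exactly for any input.
for x in [-2.0, -0.5, 0.0, 1.5]:
    assert tiny_net(x) == abs(x)
```

This is only an exact-representation special case; the UAT's general statement is about approximating arbitrary continuous functions to any desired precision as the hidden layer grows.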
Meanwhile, I have been thinking about the modern successes of deep learning and how many computer vision researchers resisted the movement away from hand-defined features towards deep, uninterpretable neural networks.

Using Keras' Pretrained Neural Networks for Visual Similarity Recommendations
https://www.ethanrosenthal.com/2016/12/05/recasketch-keras/
Mon, 05 Dec 2016

To close out our series on building recommendation models using Sketchfab data, I will venture far from the previous posts’ ({{< ref "/blog/implicit-mf-part-2" >}}) factorization-based methods and instead explore an unsupervised, deep learning-based model. You’ll find that the implementation is fairly simple with remarkably promising results, which is almost a smack in the face to all of that effort put in earlier.
We are going to build a model-to-model recommender using thumbnail images of 3D Sketchfab models as our input and the visual similarity between models as our recommendation score.
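The scoring step can be sketched in plain Python. This is a hypothetical illustration, not the post's code: assume each Sketchfab model's thumbnail has already been run through a pretrained network to produce an embedding vector (the model ids and embedding values below are made up), and rank other models by cosine similarity to the query.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def recommend(query_id, embeddings, k=2):
    """Return the k model ids most visually similar to query_id."""
    query = embeddings[query_id]
    scores = [
        (other_id, cosine_similarity(query, emb))
        for other_id, emb in embeddings.items()
        if other_id != query_id
    ]
    scores.sort(key=lambda pair: pair[1], reverse=True)
    return [other_id for other_id, _ in scores[:k]]

# Toy thumbnail embeddings keyed by made-up model ids. In practice these
# would come from a pretrained network's penultimate layer.
embeddings = {
    "castle": [0.9, 0.1, 0.0],
    "fortress": [0.8, 0.2, 0.1],
    "teapot": [0.0, 0.9, 0.4],
}
print(recommend("castle", embeddings))  # "fortress" ranks above "teapot"
```

Cosine similarity is a common choice here because it compares the direction of embedding vectors while ignoring their magnitude; with real embeddings the dictionary of lists would typically be a matrix so all similarities can be computed at once.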