Daily log to track my progress on the 100 days of ML code challenge.
100 Day ML Challenge to learn and implement ML/DL concepts ranging from the basics to more advanced state-of-the-art models.
- Started the Machine Learning course by Stanford University on Coursera.
- Utilized TensorFlow tensors for matrix multiplication.
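A minimal sketch of that kind of tensor matrix multiplication (the values are made up for illustration):

```python
import tensorflow as tf

# Two small matrices as constant tensors (illustrative values).
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[5.0, 6.0], [7.0, 8.0]])

# Matrix product via tf.matmul (equivalent to the @ operator on tensors).
c = tf.matmul(a, b)
print(c.numpy())  # [[19. 22.], [43. 50.]]
```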
- Created an informative notebook for multivariate regression.
- Used the Seoul Bike Sharing Demand dataset from the UCI Machine Learning Repository for multivariate regression.
- Utilized the Keras library through TensorFlow.
- Used a Sequential model with two hidden layers.
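A rough sketch of a two-hidden-layer Sequential regression model of the kind described above; the layer widths, the input dimension, and the commented-out training call are assumptions rather than the exact configuration used:

```python
from tensorflow import keras

# Two hidden layers feeding a single regression output (predicted bike demand).
# num_features stands in for the preprocessed Seoul Bike Sharing feature count.
num_features = 13  # assumption for illustration

model = keras.Sequential([
    keras.layers.Dense(64, activation="relu", input_shape=(num_features,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
# model.fit(X_train, y_train, epochs=100, validation_split=0.2)
```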
- Built a custom hand-tuned regression model based on previous results.
- Trained it using basic matrix operations and the Adam optimizer (sketch below).
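A minimal sketch of what a hand-tuned regression loop with raw matrix operations and Adam can look like; the shapes, learning rate, and variable names are assumptions:

```python
import tensorflow as tf

num_features = 13  # assumption for illustration
W = tf.Variable(tf.random.normal([num_features, 1]))
b = tf.Variable(tf.zeros([1]))
optimizer = tf.keras.optimizers.Adam(learning_rate=0.01)

def train_step(X, y):
    with tf.GradientTape() as tape:
        y_pred = tf.matmul(X, W) + b                   # basic matrix operations
        loss = tf.reduce_mean(tf.square(y - y_pred))   # mean squared error
    grads = tape.gradient(loss, [W, b])
    optimizer.apply_gradients(zip(grads, [W, b]))
    return loss
```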
- Watched Stanford's CS229 lecture on Linear Regression and Gradient Descent taught by Andrew Ng.
- Watched Stanford's CS229 lecture on GDA & Naive Bayes.
- Noted the difference between Generative and Discriminative models.
- Started a mini-project on classifying Iris flowers using Naive Bayes.
- Learned a lot about Naive Bayes through several videos on YouTube.
- Finished the Iris Flower Classifier using Naive Bayes.
- Reached an accuracy of about 96%.
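The classifier itself is only a few lines with scikit-learn; this sketch shows the general setup (the split ratio and random seed are assumptions):

```python
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Gaussian Naive Bayes on the Iris dataset.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

clf = GaussianNB().fit(X_train, y_train)
print(accuracy_score(y_test, clf.predict(X_test)))  # typically around 0.95-0.97
```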
- Learned a lot about Support Vector Machines by watching several videos on YouTube, such as:
- Stanford's CS229 lecture on Support Vector Machines.
- Support Vector Machines, Clearly Explained!!!
- MIT 6.034 Artificial Intelligence lecture 16 on Learning: Support Vector Machines.
- Started a project on classifying Breast Cancer Tumors using SVM.
- Followed a YouTube tutorial on SVMs by Sentdex.
- Achieved an accuracy of around 97%.
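A sketch of the scikit-learn workflow from that kind of tutorial; the kernel and split parameters are assumptions:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# SVM classifier for benign vs. malignant tumors.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

clf = SVC(kernel="linear").fit(X_train, y_train)
print(clf.score(X_test, y_test))  # accuracy on the held-out set
```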
- Going back to the basics and approaching classification from a mathematical standpoint.
- Completed the Classification and Representation section in the Machine Learning course by Stanford on Coursera.
- Watched Stanford's CS229 lecture on Kernels.
- Learned the representer theorem.
- Finished the Stanford CS229 lecture on Kernels.
- Learned about the computational complexity difference when using the inner product.
- Watched Stanford's CS229 lecture on Data Splits, Models & Cross-Validation.
- Learned about:
- Overfitting and underfitting in terms of bias and variance.
- The regularization technique.
- Finished watching the CS229 lecture on Cross-Validation.
- Learned about:
- How and when to use k-fold cross-validation.
- How and when to use leave-one-out cross-validation.
- Feature selection.
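A small sketch contrasting the two cross-validation schemes from the lecture; the estimator and dataset are placeholders:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, LeaveOneOut, cross_val_score

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000)

# k-fold: average the score over k held-out folds.
kfold_scores = cross_val_score(model, X, y, cv=KFold(n_splits=5, shuffle=True, random_state=0))
# leave-one-out: every sample gets its turn as the single held-out point.
loo_scores = cross_val_score(model, X, y, cv=LeaveOneOut())

print(kfold_scores.mean(), loo_scores.mean())
```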
- Watched Stanford's CS229 lecture on Approx/Estimation Error.
- Learned about:
- Sampling Distributions
- Parameter View
- Bayes Error
- Approximation Error
- Estimation Error
- Finished up the CS229 lecture on ERM (empirical risk minimization).
- Uniform convergence
- Started watching Stanford's CS229 lecture on Decision Trees and Ensemble Methods.
- Misclassification loss and its issues with distinguishing between certain cases.
- How cross-entropy loss tackles the shortcomings of misclassification loss.
- Continued Stanford's CS229 lecture on Decision Trees and Ensemble Methods.
- Regression Trees.
- Regularization of Decision Trees.
- Runtime for Decision Trees.
- Advantages and disadvantages of decision trees.
- Finished up Stanford's CS229 lecture on Decision Trees and Ensemble Methods.
- How to combine different learning algorithms and average their results.
- How to utilize different training sets.
- Implemented decision trees on the iris dataset from UC Irvine Machine Learning Repository.
- Achieved an accuracy of ~97%.
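The scikit-learn version of that experiment fits in a few lines; the max_depth and split values are assumptions:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Decision tree on the Iris dataset.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

tree = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
print(tree.score(X_test, y_test))  # in the ballpark of the ~97% above
```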
- Started Stanford's CS229 lecture on Introduction to Neural Networks.
- Learned about:
- Equational form of neurons and models.
- Neural networks as a form of linear regression.
- Softmax
- Continued Stanford's CS229 lecture on Introduction to Neural Networks.
- Learned about:
- End to end learning
- Black box models
- Propagation equations
- Trained neural network model to classify images of clothing.
- Utilized Fashion MNIST dataset.
- Followed the TensorFlow guide.
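The model roughly follows TensorFlow's basic classification guide; the epoch count and layer width below are the tutorial's defaults, not necessarily what was used here:

```python
from tensorflow import keras

# Load and scale Fashion MNIST.
(x_train, y_train), (x_test, y_test) = keras.datasets.fashion_mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# Simple dense classifier over the 10 clothing classes.
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(10),  # logits
])
model.compile(optimizer="adam",
              loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=10)
print(model.evaluate(x_test, y_test, verbose=2))
```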
- Started Stanford's CS229 lecture on Backprop & Improving Neural Networks.
- Learned how to improve neural networks.
- Vanishing/Exploding Gradient problem.
- Started Stanford's CS229 lecture on Debugging ML Models and Error Analysis.
- Methods for fixing the learning algorithm.
- Week 4 of the Machine Learning course on Coursera.
- Non-linear Hypotheses.
- Neurons and the Brain.
- Model representation.
- Continued Week 4 of the Machine Learning course on Coursera.
- Sentiment analysis neural network classifier.
- Utilized the IMDB dataset.
- Unsupervised learning.
- Started Stanford's CS229 lecture on Expectation-Maximization Algorithms.
- Continued Stanford's CS229 lecture on Expectation-Maximization Algorithms.
- The math behind K-means clustering.
- Implemented K-Means using scikit-learn.
- Generated a random dataset for clustering.
- Used scikit-learn's KMeans implementation.
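A sketch of that clustering setup; the blob parameters and cluster count are arbitrary choices for illustration:

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Generate a random dataset with four well-separated clusters.
X, _ = make_blobs(n_samples=500, centers=4, random_state=7)

# Fit scikit-learn's KMeans and inspect the learned centers and labels.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=7).fit(X)
print(kmeans.cluster_centers_)
print(kmeans.labels_[:10])
```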
- Some of the things I learned today:
- What are convolutional neural networks?
- What is the function of the CNN kernel?
- Continued to read up on ConvNets.
- Learned about the max pooling layer.
- Utilized the CIFAR10 dataset.
- Followed TensorFlow's Convolutional Neural Network tutorial.
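The network below is the conv/max-pool stack along the lines of TensorFlow's CNN tutorial on CIFAR-10:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Load and scale CIFAR-10.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# Three conv layers with max pooling, then a small dense head.
model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(32, 32, 3)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10),
])
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=["accuracy"])
# model.fit(x_train, y_train, epochs=10, validation_data=(x_test, y_test))
```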
- Some of the things I learned today:
- What are recurrent neural networks?
- What makes RNNs more powerful than other architectures?
- Learned about the different RNN architectures.
- Explored the different applications of RNNs.
- Implemented an RNN using Keras.
- Trained it on the IMDB reviews dataset.
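A minimal Keras sketch of that classifier; the vocabulary size, sequence length, and layer sizes are assumptions:

```python
from tensorflow import keras

vocab_size, maxlen = 10000, 256  # assumptions for illustration

# IMDB reviews, padded to a fixed length.
(x_train, y_train), (x_test, y_test) = keras.datasets.imdb.load_data(num_words=vocab_size)
x_train = keras.preprocessing.sequence.pad_sequences(x_train, maxlen=maxlen)
x_test = keras.preprocessing.sequence.pad_sequences(x_test, maxlen=maxlen)

# Embedding -> simple RNN -> binary sentiment output.
model = keras.Sequential([
    keras.layers.Embedding(vocab_size, 32),
    keras.layers.SimpleRNN(32),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))
```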
- Built a deep learning computer to train networks.
- Here are the basic specs:
- CPU: Ryzen 7 3800XT
- GPU: Nvidia 3080 FE
- RAM: 16GB 3600MHz
- Trained the model.
- Reached final accuracy of 0.855.
- Learned about:
- Why LSTMs were made.
- How LSTMs solved issues with RNNs.
- Learned more about the applications of LSTMs.
- Dove deep into the architecture end of LSTMs.
- Utilized the New York Stock Exchange dataset on Kaggle.
- Used TensorFlow and Keras to implement the model.
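A sketch of a sliding-window LSTM regressor of that kind; the window length, layer sizes, and the `prices` input are placeholders, with the real closing prices coming from the NYSE dataset:

```python
import numpy as np
from tensorflow import keras

window = 60  # assumption: 60 past days per training example

def make_windows(prices, window):
    # Turn a 1-D price series into (samples, window, 1) inputs and next-day targets.
    X, y = [], []
    for i in range(len(prices) - window):
        X.append(prices[i:i + window])
        y.append(prices[i + window])
    return np.array(X)[..., np.newaxis], np.array(y)

model = keras.Sequential([
    keras.layers.LSTM(50, return_sequences=True, input_shape=(window, 1)),
    keras.layers.LSTM(50),
    keras.layers.Dense(1),  # predicted next closing price
])
model.compile(optimizer="adam", loss="mse")
# X, y = make_windows(scaled_closing_prices, window)
# model.fit(X, y, epochs=20, batch_size=32)
```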
- Learned:
- What are GRUs?
- Applications of GRUs.
- GRUs vs LSTMs.
- Learned how to implement a GRU model using TensorFlow and Keras.
- Started on a new mini-project to put the GRUs to use.
- Utilized the IBM stock dataset to predict stock prices.
- Learned about:
- What Hopfield networks are.
- How to use Hopfield networks.
- How Hopfield networks improve on the RNN model.
- Learned about:
- What Boltzmann Machines are.
- Use cases for Boltzmann Machines
- The architecture of a Boltzmann Machine.
- Learned about:
- What Deep Belief Networks are.
- The general architecture of a DBN.
- Learned about:
- What Autoencoder networks are.
- How an Autoencoder functions.
- The components that make up an Autoencoder.
- Applications of Autoencoders.
- Utilized TensorFlow to implement autoencoders.
- Performed image denoising on the Fashion MNIST dataset.
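A simplified dense version of the denoising setup (the actual run followed TensorFlow's autoencoder tutorial; the latent size and noise level here are assumptions):

```python
import tensorflow as tf
from tensorflow import keras

# Flattened, normalized Fashion MNIST images.
(x_train, _), _ = keras.datasets.fashion_mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0

# Add Gaussian noise; the model learns to reconstruct the clean image.
x_noisy = tf.clip_by_value(x_train + 0.2 * tf.random.normal(shape=x_train.shape), 0.0, 1.0)

autoencoder = keras.Sequential([
    keras.layers.Dense(64, activation="relu"),      # encoder to a 64-dim code
    keras.layers.Dense(784, activation="sigmoid"),  # decoder back to pixels
])
autoencoder.compile(optimizer="adam", loss="mse")
# autoencoder.fit(x_noisy, x_train, epochs=10, batch_size=256)
```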
- Utilized TensorFlow to implement autoencoders.
- Performed anomaly detection on the ECG5000 dataset.
- Learned about:
- What generative adversarial networks are.
- What GANs are used for.
- The architecture of a GAN.
- Used TensorFlow to implement GANs.
- Utilized the MNIST dataset for generating handwritten digits.
- Continuation of the implementation I started yesterday.
- Worked on the loss & optimizer.
- Training the model took a lot longer than I was expecting.
- Trained the model for 50 epochs. Each epoch took around 1.5 min.
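The loss and optimizer work mentioned above followed the pattern of TensorFlow's DCGAN tutorial; a sketch of that part:

```python
import tensorflow as tf

cross_entropy = tf.keras.losses.BinaryCrossentropy(from_logits=True)

def discriminator_loss(real_output, fake_output):
    # Real images should be scored as 1, generated images as 0.
    real_loss = cross_entropy(tf.ones_like(real_output), real_output)
    fake_loss = cross_entropy(tf.zeros_like(fake_output), fake_output)
    return real_loss + fake_loss

def generator_loss(fake_output):
    # The generator wants the discriminator to label its images as real.
    return cross_entropy(tf.ones_like(fake_output), fake_output)

generator_optimizer = tf.keras.optimizers.Adam(1e-4)
discriminator_optimizer = tf.keras.optimizers.Adam(1e-4)
```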
- Finished going through all of The 10 Neural Network Architectures Machine Learning Researchers Need To Learn.
- Started Lesson 1 of the fast.ai course.
- Started Lesson 2 of the fast.ai course.
- Learned about:
- Project plan for model development.
- How to create datasets.
- Productionization of models.
- Working on my research project RecycleNet.
- Cleaned and preprocessed the images for the dataset.
- Check out the entire project at RecycleNet.
- Setting up TensorFlow GPU to utilize my RTX 3080.
- Installed Docker and created a TensorFlow image.
- Started a container and ran TensorFlow code on Jupyter using TensorFlow GPU.
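A quick sanity check run inside the container to confirm the GPU is visible to TensorFlow:

```python
import tensorflow as tf

# Should list the RTX 3080 if the GPU image and drivers are set up correctly.
print(tf.config.list_physical_devices("GPU"))
print("Built with CUDA:", tf.test.is_built_with_cuda())
```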
- Started Lesson 3 of the fast.ai course.
- Learned about:
- Data augmentation using the fastai API.
- How to create notebook apps.
- Deploying using Binder.
- Feedback loops and how they can affect models over time.
- Learned about deploying a model to a server using the TensorFlow Serving library.
- Watched Siraj Raval's video on How to Deploy a Tensorflow Model to Production.
- Worked on my RecycleNet research project.
- Configured ResNet-50 to train on our custom dataset.
- Watched lesson 4 of the fastai course on Stochastic Gradient Descent.
- Continued watching lesson 4 of the fastai course on Stochastic Gradient Descent.
- Started reading about chatbots using neural networks.
- There are two types of deep learning chatbot models:
- Retrieval-based Neural Network
- Generation-based Neural Network
- Read more articles on generation-based neural networks.
- Revisited sequence to sequence models that use an encoder/decoder architecture.
- Read the article Generative Model Chatbots, which used a seq2seq model to train a chatbot on several different datasets.
- Taking time off to study for my data structures midterm!
- Learned about graphs and their similarities to a representation of a neuron.
- Started working on the research paper.
- The paper consists of a custom ResNet-50 and an SVM model.
- Started reading about seq2seq models.
- Planning on creating a chatbot mini-project soon.
- Read the article Neural machine translation with attention written by the TensorFlow team about a seq2seq model.
- Worked on fine tuning a custom ResNet model with my research partner.
- Started coding Seq2Seq model.
- Fine-tuned the SVM hyperparameters by implementing a grid search (sketch below).
- Used the custom architecture for predicting items from the dataset.
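A sketch of that grid search; the parameter grid shown is a typical choice rather than the exact grid used in the project, and `features`/`labels` stand in for the data produced by the custom ResNet-50:

```python
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Candidate hyperparameters to search over (illustrative grid).
param_grid = {
    "C": [0.1, 1, 10, 100],
    "gamma": ["scale", 0.01, 0.001],
    "kernel": ["rbf", "linear"],
}

search = GridSearchCV(SVC(), param_grid, cv=5)
# search.fit(features, labels)
# print(search.best_params_, search.best_score_)
```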
- My partner and I presented our research project to the panel members.
- We finished our paper titled Classification of Recyclable Waste Generated in Indian Households.
- Looking forward to publishing our paper at an IEEE conference.
- Read articles about the fundamentals of natural language processing.
- Learned about the different ways to understand text.
- Started to dive deeper into NLP.
- Learned about stemming and the applications of stemming.
- Learned the process of lemmatisation.
- Explored the difference between lemmatisation and stemming.
- Learned about recommender systems.
- Read about Neural Collaborative Filtering and its applications in recommender systems.
- Started reading about various optimization algorithms for training neural networks.
- Dove deep into the Adam optimizer.
- Read about momentum, which helps accelerate gradient descent.
- Continued reading about optimizers by exploring NAG (Nesterov accelerated gradient).
- Explored another optimizer which monotonically reduces the learning rate.
- Learned about the Adagrad optimizer which adapts the learning rate to individual features.
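Toy update rules for two of the optimizers covered in this reading, written out in NumPy; `grad` and the hyperparameters are placeholders for illustration:

```python
import numpy as np

def momentum_step(w, v, grad, lr=0.01, beta=0.9):
    # Momentum: accumulate a velocity that smooths and accelerates descent.
    v = beta * v + grad(w)
    return w - lr * v, v

def adagrad_step(w, g2, grad, lr=0.01, eps=1e-8):
    # Adagrad: per-parameter learning rates from the running sum of squared gradients.
    g = grad(w)
    g2 = g2 + g ** 2
    return w - lr * g / (np.sqrt(g2) + eps), g2
```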
- Started exploring the human brain from a neuroscience perspective.
- Read more about Donald Hoffman's case against reality.
- Started looking at different parts of the brain and how they function.
- Trying to draw the relationship between artificial neurons and neurons in the human brain.
- Started a tutorial on GCP.
- Learning how to use their cloud services for machine learning.