Renzym Education

We intend to upload videos covering various courses as well as tutorials on applying theory in practice. Most of these videos are the collaborative work of Abasyn University Islamabad Campus and RENZYM.

Facebook: https://www.facebook.com/renzymeducation
Abasyn Islamabad: http://www.abasynisb.edu.pk

I have tried to sum up the complete Machine Learning course in 5 hours in this video. In addition to the topics covered in Andrew Ng's famous original Coursera course (Linear and Logistic Regression, Regularization, Neural Networks, SVMs, Clustering, PCA, Recommender Systems, Decision Trees), we also covered CNNs (Convolutional Neural Networks), which are part of the Deep Learning Specialization course.

This is especially useful for lazy students who want to cover the course on the last day before the exams 🙂 (I don't intend to promote such behavior). But my main objective was that a 30-lecture course becomes too big a commitment for most people. Now they can go through a shorter course instead of the complete one, and fall back on specific videos if some topic is unclear; the link to the course playlist is in the video description, along with course material like slides and assignments. Needless to say, without doing the assignments we can't retain the concepts for long. It's also useful for people preparing for interviews for entry-level Machine Learning jobs.

This video, as well as the course videos, are all time-tagged, i.e. you can jump to the relevant section of a video instead of watching the complete video by clicking the time tags of the topics in the video description.

#machinelearning #neuralnetworks
www.youtube.com/watch?v=HGYWEOiQgWM
We discussed the face recognition and face verification problems in this video: how we can create a Siamese network and either use the triplet loss function or treat it as a binary classification problem. The ideas were from the famous DeepFace and FaceNet architectures. We followed it up with a neural style transfer example, which can be used to create machine-generated art. Along the way, we discussed how we can visualize what features the deep layers are learning.
www.youtube.com/watch?v=jmB4r7R35tE&list=PLoVsjlwzzlUT74osi5McUu7H80L6X6nzB&index=27
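As a quick illustration (not from the lecture itself), here is a minimal NumPy sketch of the FaceNet-style triplet loss on a batch of embeddings; the batch size, embedding dimension and margin value are made up for the example:

import numpy as np

def triplet_loss(anchor, positive, negative, alpha=0.2):
    # Distance between the anchor and the positive (same person)
    # and between the anchor and the negative (different person).
    pos_dist = np.sum((anchor - positive) ** 2, axis=1)   # ||f(a) - f(p)||^2
    neg_dist = np.sum((anchor - negative) ** 2, axis=1)   # ||f(a) - f(n)||^2
    # Push the negative at least a margin alpha further away than the positive.
    return np.mean(np.maximum(pos_dist - neg_dist + alpha, 0.0))

# Hypothetical 128-dimensional embeddings for a batch of 4 triplets
rng = np.random.default_rng(0)
a, p, n = (rng.normal(size=(4, 128)) for _ in range(3))
print(triplet_loss(a, p, n))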
In this week's machine learning lecture we covered the main ideas in the YOLO algorithm for object detection. We started with landmark detection, then discussed the convolutional implementation of sliding windows, which is much faster and computationally less complex. Then we discussed intersection over union (IoU) and used it for non-max suppression (removing duplicate detections). Finally, we discussed anchor boxes to detect objects of different aspect ratios and put it all together as the YOLO algorithm.
www.youtube.com/watch?v=yUJKTwlHE-8
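For a rough idea of the IoU and non-max suppression steps, here is a small Python sketch; the box coordinates, scores and the 0.5 threshold are made-up illustration values, not from the lecture:

def iou(box_a, box_b):
    # Intersection over union of two boxes given as (x1, y1, x2, y2).
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def non_max_suppression(boxes, scores, iou_threshold=0.5):
    # Keep the highest-scoring box, drop boxes that overlap it too much, repeat.
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_threshold]
    return keep

# Two overlapping detections of the same object plus one separate detection
boxes = [(10, 10, 50, 50), (12, 12, 52, 52), (100, 100, 140, 140)]
scores = [0.9, 0.8, 0.7]
print(non_max_suppression(boxes, scores))  # keeps boxes 0 and 2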
In this week's lecture we covered 1×1 convolutions. The name confused me when I first heard the term, as it sounded like simply a scaling, but actually it's not. We saw how 1×1 convolutions are used to construct Inception layers, which are in turn used in the GoogLeNet network. This was followed by practical advice on how to use transfer learning and data augmentation to get going instead of starting from scratch.
www.youtube.com/watch?v=nDfu3WODvOM
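To see why a 1×1 convolution is more than a scaling, here is a small NumPy sketch that treats it as a per-position linear combination across channels; the 192-to-32 channel reduction is just the kind of shrinking used inside an Inception layer, assumed here for illustration:

import numpy as np

def conv_1x1(feature_map, weights):
    # At every spatial position, mix the input channels with a weight matrix.
    # This can shrink (or grow) the channel dimension, which is what a plain
    # per-channel scaling cannot do.
    # feature_map: (H, W, C_in), weights: (C_in, C_out) -> (H, W, C_out)
    h, w, c_in = feature_map.shape
    out = feature_map.reshape(h * w, c_in) @ weights
    return out.reshape(h, w, -1)

# Reduce a 28x28x192 volume to 28x28x32 before a more expensive convolution
x = np.random.rand(28, 28, 192)
w = np.random.rand(192, 32)
print(conv_1x1(x, w).shape)  # (28, 28, 32)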
This week in machine learning, we started covering convolutional neural networks. We covered the convolution operation with edge detection as an example, and then discussed padding, strided convolution and convolution over volumes.
www.youtube.com/watch?v=kQV-BZ83U-4&list=PLoVsjlwzzlUT74osi5McUu7H80L6X6nzB&index=23
Then we covered the architecture of some famous CNNs like LeNet-5, AlexNet, ResNets and VGG-16 as examples.
www.youtube.com/watch?v=v6YAzdYbHcc&list=PLoVsjlwzzlUT74osi5McUu7H80L6X6nzB&index=24
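For the convolution operation itself, here is a minimal NumPy sketch with a vertical edge detection filter; the image values, filter and default padding/stride are made up for illustration (the output size follows floor((n + 2p - f) / s) + 1):

import numpy as np

def conv2d(image, kernel, pad=0, stride=1):
    # Plain 2D convolution (technically cross-correlation, as in most DL courses).
    if pad:
        image = np.pad(image, pad)
    f = kernel.shape[0]
    out_h = (image.shape[0] - f) // stride + 1
    out_w = (image.shape[1] - f) // stride + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            patch = image[i * stride:i * stride + f, j * stride:j * stride + f]
            out[i, j] = np.sum(patch * kernel)
    return out

# Bright left half, dark right half: the filter responds strongly at the edge
image = np.hstack([np.full((6, 3), 10.0), np.zeros((6, 3))])
vertical_edge_filter = np.array([[1, 0, -1],
                                 [1, 0, -1],
                                 [1, 0, -1]], dtype=float)
print(conv2d(image, vertical_edge_filter))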
This week in machine learning, we covered Decision Trees, Random Forests and XGBoost. We introduced decision trees in the first lecture: the learning model, using entropy to split the data, and then extending it to incorporate categorical (instead of binary) and continuous features.
www.youtube.com/watch?v=jSFjuMh79MY&list=PLoVsjlwzzlUT74osi5McUu7H80L6X6nzB&index=21
In the 2nd lecture, we discussed tree ensembles, i.e. bagged decision trees, Random Forests and XGBoost, and gave some advice on when to use what.
www.youtube.com/watch?v=u1435qvkatU&list=PLoVsjlwzzlUT74osi5McUu7H80L6X6nzB&index=22
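To make the entropy-based split concrete, here is a small NumPy sketch of entropy and information gain; the toy cat/not-cat labels and the "pointy ears" feature are hypothetical, not data from the lecture:

import numpy as np

def entropy(labels):
    # H = -sum(p_i * log2(p_i)) over the class proportions at a node.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(labels, left_mask):
    # Entropy reduction from splitting a node into left/right children,
    # weighted by the fraction of examples that land in each child.
    left, right = labels[left_mask], labels[~left_mask]
    w_left = len(left) / len(labels)
    return entropy(labels) - (w_left * entropy(left) + (1 - w_left) * entropy(right))

# Toy node: 1 = cat, 0 = not cat; candidate split on a hypothetical feature
y = np.array([1, 1, 1, 0, 1, 0, 0, 0, 0, 0])
has_pointy_ears = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0], dtype=bool)
print(information_gain(y, has_pointy_ears))  # the split with the highest gain wins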
