XredaX/Machine-Learning


Machine Learning Course - Overview

This repository contains an overview and key topics covered in a machine learning course. The course covers a wide range of supervised and unsupervised learning algorithms, with a focus on both theory and practical applications.

Course Contents

  1. Linear Regression (Single Variable): Introduction to basic linear regression using a single feature.
  2. Linear Regression (Multiple Variables): Extending linear regression to handle multiple features.
  3. Gradient Descent and Cost Function: Learning optimization techniques for parameter tuning.
  4. Save Model Using Joblib and Pickle: Persisting trained models with Joblib and Pickle.
  5. Dummy Variables & One Hot Encoding: Handling categorical data with dummy variables and one-hot encoding.
  6. Training and Testing Data: Splitting data into training and testing sets for evaluation.
  7. Logistic Regression (Binary Classification): Introduction to logistic regression for binary classification tasks.
  8. Logistic Regression (Multiclass Classification): Extending logistic regression to multiclass classification problems.
  9. Decision Tree: Decision tree algorithms for classification and regression.
  10. SVM (Support Vector Machine): Support vector machine algorithm for classification and regression.
  11. Random Forest: Ensemble learning using the random forest algorithm.
  12. K-Fold Cross Validation: Cross-validation techniques to evaluate model performance.
  13. K-Means Clustering: Unsupervised learning using the K-means clustering algorithm.
  14. Naive Bayes Classifier Algorithm: Naive Bayes algorithm for classification tasks.
  15. Hyperparameter Tuning (GridSearchCV): Tuning model hyperparameters using GridSearchCV.
  16. L1 and L2 Regularization (Lasso, Ridge Regression): Preventing overfitting with L1 and L2 regularization.
  17. K-Nearest Neighbors (KNN) Classification: KNN algorithm for classification problems.
  18. Principal Component Analysis (PCA): Dimensionality reduction using PCA.
  19. Ensemble Learning - Bagging: Using ensemble learning techniques, specifically bagging.
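As a taste of how several of these topics fit together in practice, here is a minimal sketch (my own illustration, not code from this repository) combining a train/test split (topic 6), single-variable linear regression (topic 1), and model persistence with Joblib and Pickle (topic 4), using scikit-learn:

```python
# Illustrative sketch only; the repository's notebooks may differ in detail.
import pickle

import joblib
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Synthetic single-feature data: y ≈ 3x + 5 plus a little noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 3 * X.ravel() + 5 + rng.normal(0, 0.5, size=100)

# Topic 6: split into training and testing sets for evaluation
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Topic 1: fit a single-variable linear regression
# (scikit-learn solves this with least squares rather than explicit
# gradient descent, but the fitted parameters are equivalent)
model = LinearRegression()
model.fit(X_train, y_train)
print("Test R^2:", model.score(X_test, y_test))

# Topic 4: persist the trained model with Joblib and with Pickle
joblib.dump(model, "model.joblib")
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)

# Reloading gives back an identical, ready-to-use model
restored = joblib.load("model.joblib")
assert np.allclose(restored.predict(X_test), model.predict(X_test))
```

The same `fit` / `predict` / `score` pattern carries over to most of the other estimators in the list (logistic regression, decision trees, SVM, random forest, KNN), which is what makes scikit-learn convenient for working through a course like this.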

Video Source

For a comprehensive guide and practical walkthrough of the topics listed above, refer to the accompanying YouTube playlist.

About

This repository includes key concepts and implementations from a machine learning course, covering algorithms like regression, classification, clustering, and ensemble methods. It provides both theoretical insights and practical examples.
