
What you'll learn

  • Fundamentals of Data science, Machine learning, Computer vision and Natural language processing.
  • Data exploration, data preprocessing, handling missing values.
  • Feature engineering and exploratory data analysis.
  • Data visualization techniques.
  • Descriptive and inferential statistics, probability.
  • Working with Tableau and Power BI.
  • Cross-validation techniques.
  • Model selection, model training, model evaluation and model prediction.
  • Supervised learning, Unsupervised learning and Reinforcement learning.
  • Regression, Classification, Clustering, Association rules.
  • Linear regression, Logistic regression, Support vector machine, Naïve Bayes, Decision tree, Random forest, K-nearest neighbors and others.
  • Ensemble learning – bagging and boosting.
  • K-means, DBSCAN, Hierarchical clustering.
  • Content based filtering and Collaborative filtering.
  • Recommendation systems and their working process.
  • AdaBoost, XGBoost, CatBoost, Gradient boosting, etc.
  • Deep learning and Neural networks.
  • Perceptron, Artificial neural networks, Feed forward neural network.
  • Back-propagation algorithm.
  • Weights, biases and the bias-variance tradeoff.
  • Overfitting and underfitting.
  • Activation functions, optimizers and loss / cost functions.
  • Epochs, steps per epoch, batch size, validation steps, learning rate, etc.
  • Image and video processing with OpenCV and Mediapipe libraries.
  • Data augmentation and data annotation.
  • CNN architecture, hyperparameter tuning and transfer learning.
  • Generative adversarial networks.
  • Image classification, Object detection, Image segmentation, Face recognition, Pose estimation, Face generation, Image filtering, Art and painting generation, etc.
  • NLP components – Natural language understanding and Natural language generation.
  • NLP phases - lexical / morphological analysis, Syntactic analysis, Semantic analysis, Discourse integration, Pragmatic analysis, Word Sense Disambiguation.
  • Various text preprocessing and feature extraction techniques.
  • Recurrent neural networks (RNN), LSTM, GRU, Encoder and Decoder, Transformers and Hugging face transformers.
  • Text classification, Text summarization, Text paraphrasing, Grammar correction, Language modeling, Topic modeling, Text generation, Question and Answer generation, Chatbots, Text translation.
  • Project management, development and deployment.
  • Web scraping techniques.
  • API development using the FastAPI framework.
  • Working with scikit-learn, TensorFlow, Pandas, NumPy, Matplotlib, Seaborn and Plotly.
  • Hands-on experience with real-world projects.
  • Machine learning interview questions.
  • Machine learning mock interview preparation.
  • Help with resume creation.

Requirements

  • Bring your own laptop with a reasonable configuration.
  • Knowledge of the Python programming language.

  • Course overview
  • Course outcome
  • Installing Anaconda and Jupyter Notebook
  • Working with environments

  • Introduction to Machine learning
  • Importance of Machine learning
  • Industrial applications of Machine learning
  • Problem statement analysis
  • Numerical and categorical variables
  • Types of Machine learning
  • Machine learning pipeline

  • Explore various data exploration methods
  • Methods for handling missing values
  • Working with the Pandas, NumPy and scikit-learn libraries
  • One-hot encoding and label encoding
  • Data normalization, data standardization and quantiles
  • Grid search, random search
  • Cross-validation techniques
  • Group by and pivot table
  • Perform exploratory data analysis (EDA); a short preprocessing and cross-validation sketch follows this module
  • Project 1
  • Project 2
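
The preprocessing steps above come together naturally in a scikit-learn pipeline. Below is a minimal sketch, assuming a small synthetic DataFrame (the column names age, salary, city and bought are purely illustrative), that imputes missing values, encodes the categorical feature, and runs cross-validation plus a small grid search.

```python
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Toy dataset with numerical and categorical columns plus missing values.
df = pd.DataFrame({
    "age":    [25, 32, np.nan, 40, 29, 51, 38, 45],
    "salary": [30000, 48000, 52000, np.nan, 41000, 60000, 39000, 57000],
    "city":   ["Delhi", "Mumbai", "Delhi", np.nan, "Chennai", "Mumbai", "Delhi", "Chennai"],
    "bought": [0, 1, 1, 0, 1, 1, 0, 0],
})
X, y = df.drop(columns="bought"), df["bought"]

# Impute + scale numerical features; impute + one-hot encode the categorical one.
preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]), ["age", "salary"]),
    ("cat", Pipeline([("impute", SimpleImputer(strategy="most_frequent")),
                      ("onehot", OneHotEncoder(handle_unknown="ignore"))]), ["city"]),
])
model = Pipeline([("prep", preprocess), ("clf", LogisticRegression())])

# Cross-validation plus a small grid search over the regularization strength C.
print("CV accuracy:", cross_val_score(model, X, y, cv=4).mean())
grid = GridSearchCV(model, {"clf__C": [0.1, 1.0, 10.0]}, cv=4).fit(X, y)
print("Best params:", grid.best_params_)
```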

  • Descriptive statistics - mean, mode, median, standard deviation, variance, etc.
  • Data distributions, skewness and kurtosis
  • Inferential statistics - various feature selection techniques, statistical tests, hypothesis testing
  • Probability
  • Handling outliers
  • Dimensionality reduction techniques – PCA, LDA, etc.
  • Practical exercises (a short statistics sketch follows this module)
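
A minimal sketch of the statistics topics above, using randomly generated samples (the group means and sizes are arbitrary): descriptive statistics, skewness and kurtosis, a two-sample t-test as one example of hypothesis testing, and PCA as one dimensionality reduction technique.

```python
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
sample_a = rng.normal(loc=50, scale=5, size=200)   # e.g. scores of group A
sample_b = rng.normal(loc=52, scale=5, size=200)   # e.g. scores of group B

# Descriptive statistics
print("mean:", sample_a.mean(), "median:", np.median(sample_a), "std:", sample_a.std(ddof=1))
print("skewness:", stats.skew(sample_a), "kurtosis:", stats.kurtosis(sample_a))

# Inferential statistics: two-sample t-test (H0: the two group means are equal)
t_stat, p_value = stats.ttest_ind(sample_a, sample_b)
print("t =", t_stat, "p =", p_value)

# Dimensionality reduction with PCA on a toy 5-feature matrix
X = rng.normal(size=(200, 5))
X_reduced = PCA(n_components=2).fit_transform(X)
print("reduced shape:", X_reduced.shape)
```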

  • Introduction to Data visualization
  • Importance of Data visualization
  • Explanation of various graphs / charts
  • Working with Tableau and Power BI
  • Practical work with the Matplotlib, Seaborn and Plotly libraries (a short plotting sketch follows this module)
  • Project 1
  • Project 2
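
A minimal plotting sketch with Matplotlib and Seaborn, assuming Seaborn's bundled "tips" example dataset (downloaded on first use) purely as sample data: one distribution plot and one relationship plot.

```python
import matplotlib.pyplot as plt
import seaborn as sns

# Seaborn ships small example datasets; "tips" is used here only for illustration.
tips = sns.load_dataset("tips")

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
sns.histplot(tips["total_bill"], kde=True, ax=axes[0])                        # distribution
sns.scatterplot(data=tips, x="total_bill", y="tip", hue="time", ax=axes[1])   # relationship
axes[0].set_title("Distribution of total bill")
axes[1].set_title("Tip vs. total bill")
plt.tight_layout()
plt.show()
```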

  • Getting started with Supervised learning
  • Types of Supervised learning
  • Explanation of Regression technique
  • Algorithms of Regression technique
  • Model evaluation methods for Regression
  • Use cases of Regression technique
  • Explanation of Classification technique
  • Algorithms of Classification technique
  • Model evaluation methods for Classification
  • Use cases of Classification technique
  • Linear Regression
  • Logistic Regression
  • Support vector machine
  • Naïve Bayes algorithm
  • Decision tree
  • Random forest
  • K-nearest neighbors and others (a short classification sketch follows this module)
  • Project 1
  • Project 2
  • Project 3
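
A minimal sketch comparing several of the classifiers listed above on scikit-learn's built-in breast-cancer dataset (chosen only because it ships with the library); accuracy is the sole evaluation metric here for brevity.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Scale-sensitive models are wrapped with StandardScaler; tree-based ones are not.
models = {
    "Logistic regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "Naive Bayes": GaussianNB(),
    "Decision tree": DecisionTreeClassifier(random_state=42),
    "Random forest": RandomForestClassifier(random_state=42),
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier()),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: accuracy = {accuracy_score(y_test, model.predict(X_test)):.3f}")
```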

  • Introduction to Ensemble learning
  • Types of Ensemble learning
  • Workflow of Ensemble learning
  • Explanation of Bagging technique
  • Algorithms of Bagging technique
  • Explanation of Boosting technique
  • Types of Boosting technique
  • Algorithms of Boosting technique
  • AdaBoost
  • XGBoost
  • CatBoost
  • Gradient boosting and others (a short bagging and boosting sketch follows this module)
  • Project 1
  • Project 2
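
A minimal bagging-versus-boosting sketch using scikit-learn's own BaggingClassifier, AdaBoostClassifier and GradientBoostingClassifier on synthetic data; XGBoost and CatBoost are separate libraries and are not shown here.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification problem.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

ensembles = {
    # Bagging: many trees trained on bootstrap samples, predictions averaged.
    "Bagging (decision trees)": BaggingClassifier(DecisionTreeClassifier(),
                                                  n_estimators=50, random_state=42),
    # Boosting: weak learners added sequentially, each correcting the previous ones.
    "AdaBoost": AdaBoostClassifier(n_estimators=50, random_state=42),
    "Gradient boosting": GradientBoostingClassifier(n_estimators=50, random_state=42),
}

for name, model in ensembles.items():
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: CV accuracy = {score:.3f}")
```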

  • Getting started with Unsupervised learning
  • Types of Unsupervised learning
  • Explanation of Clustering technique
  • Algorithms of Clustering technique
  • Model evaluation methods for Clustering
  • Use cases of Clustering technique
  • K-Means, DBSCAN, Hierarchical clustering (a short clustering sketch follows this module)
  • Explanation of Association rules technique
  • Algorithms of Association rules technique
  • Model evaluation methods for Association rules
  • Use cases of Association rules technique
  • Content based filtering and Collaborative filtering
  • Recommendation systems and their working process
  • Project 1
  • Project 2
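
A minimal clustering sketch on synthetic blob data, comparing K-Means, hierarchical (agglomerative) clustering and DBSCAN, with the silhouette score as one evaluation method; the eps and min_samples values are arbitrary.

```python
from sklearn.cluster import DBSCAN, AgglomerativeClustering, KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

# Toy data with three well-separated blobs.
X, _ = make_blobs(n_samples=500, centers=3, random_state=42)
X = StandardScaler().fit_transform(X)

for name, model in {
    "K-Means": KMeans(n_clusters=3, n_init=10, random_state=42),
    "Hierarchical": AgglomerativeClustering(n_clusters=3),
    "DBSCAN": DBSCAN(eps=0.3, min_samples=5),
}.items():
    labels = model.fit_predict(X)
    # Silhouette score needs at least two distinct labels (DBSCAN marks noise as -1).
    if len(set(labels)) > 1:
        print(f"{name}: silhouette = {silhouette_score(X, labels):.3f}")
```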

  • Introduction to Reinforcement learning
  • Working process of Reinforcement learning
  • Algorithms of Reinforcement learning
  • Applications of Reinforcement learning
  • Practical

  • Introduction to Deep learning
  • Importance of Deep learning
  • Explanation of Neural networks
  • Types of Neural networks
  • Architecture of Neural networks
  • Workflow of Neural networks
  • Feed-forward propagation
  • Back propagation
  • Weights and biases
  • Weight and bias initialization techniques
  • Handling overfitting and underfitting
  • Regularizations and dropouts
  • Batch normalization
  • Explanation of activation functions
  • Various types of activation functions
  • Explanation of loss / cost functions
  • Various types of loss / cost functions
  • Explanation of optimizers
  • Various types of optimizers
  • Learn about hyperparameters: epochs, steps per epoch, batch size, validation steps, learning rate, etc.
  • Working with TensorFlow library
  • Building a custom artificial neural network (a short Keras sketch follows this module)
  • Project 1
  • Project 2
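
A minimal Keras sketch of a custom feed-forward network, reusing scikit-learn's breast-cancer dataset as input; the layer sizes, dropout rate, optimizer, loss and hyperparameter values are illustrative choices, not prescribed ones.

```python
import tensorflow as tf
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
scaler = StandardScaler()
X_train, X_test = scaler.fit_transform(X_train), scaler.transform(X_test)

# A small feed-forward network: dense layers with ReLU, dropout for regularization,
# and a sigmoid output for binary classification.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(X_train.shape[1],)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Optimizer, loss function and learning rate are the hyperparameters named above.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
              loss="binary_crossentropy", metrics=["accuracy"])

model.fit(X_train, y_train, epochs=20, batch_size=32, validation_split=0.2, verbose=0)
print(model.evaluate(X_test, y_test, verbose=0))  # [test loss, test accuracy]
```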

  • Introduction to Computer vision
  • Industrial and real-world applications of Computer vision
  • Importance of Computer vision
  • Computer vision pipeline
  • Getting started with images
  • Getting started with videos

  • Drawing functions
  • Basic operations on images
  • Arithmetic operations on images
  • Changing colorspaces
  • Geometric transformations of images
  • Image thresholding
  • Smoothing images
  • Morphological transformations
  • Image gradients
  • Canny edge detection
  • Image pyramids
  • Contours in OpenCV (a short image-processing sketch follows this module)
  • Template matching
  • Image segmentation with the watershed algorithm
  • Interactive foreground extraction using grabCut algorithm
  • Feature detection
  • Object detection
  • Project 1
  • Project 2
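
A minimal OpenCV sketch covering drawing functions, colorspace conversion, smoothing, thresholding, Canny edge detection and contours; a synthetic image is used so it runs without any file on disk.

```python
import cv2
import numpy as np

# A synthetic image stands in for a real photo.
img = np.zeros((300, 300, 3), dtype=np.uint8)
cv2.rectangle(img, (50, 50), (150, 150), (255, 255, 255), -1)    # drawing functions
cv2.circle(img, (220, 220), 40, (200, 200, 200), -1)

gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)                     # changing colorspaces
blurred = cv2.GaussianBlur(gray, (5, 5), 0)                      # smoothing
_, thresh = cv2.threshold(blurred, 127, 255, cv2.THRESH_BINARY)  # thresholding
edges = cv2.Canny(blurred, 50, 150)                              # Canny edge detection

# Contours: outlines of the two drawn shapes.
contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
print("shapes found:", len(contours))
```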

  • Various operations on video
  • Meanshift and camshift
  • Background subtraction
  • Video filters
  • Video analysis

  • Introduction to Mediapipe
  • Image processing with Mediapipe
  • Video processing with Mediapipe (a short pose-estimation sketch follows this module)
  • Project 1
  • Project 2
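
A minimal Mediapipe sketch using the mp.solutions pose module to draw pose landmarks on webcam frames; the webcam index and confidence threshold are assumptions, and a video file path can be substituted.

```python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
mp_drawing = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # webcam; replace 0 with a video file path if preferred
with mp_pose.Pose(min_detection_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # Mediapipe expects RGB input, while OpenCV frames are BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            mp_drawing.draw_landmarks(frame, results.pose_landmarks, mp_pose.POSE_CONNECTIONS)
        cv2.imshow("Pose estimation", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```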

  • Introduction to Data augmentation
  • Importance of Data augmentation
  • Data augmentation with Albumentations (a short sketch follows this module)
  • Data augmentation with imgaug
  • Data augmentation with Scikit-image
  • Project 1
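
A minimal augmentation sketch, assuming the Albumentations library named above; the chosen transforms, probabilities and crop size are arbitrary, and a random array stands in for a real image.

```python
import albumentations as A
import numpy as np

# A random "image" stands in for a real photo so the sketch runs as-is.
image = np.random.randint(0, 256, size=(256, 256, 3), dtype=np.uint8)

# A typical augmentation pipeline: flip, colour jitter, rotation and crop,
# each applied with the given probability.
transform = A.Compose([
    A.HorizontalFlip(p=0.5),
    A.RandomBrightnessContrast(p=0.3),
    A.Rotate(limit=20, p=0.5),
    A.RandomCrop(height=224, width=224, p=1.0),
])

augmented = transform(image=image)["image"]
print(augmented.shape)  # (224, 224, 3)
```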

  • Introduction to Data annotation
  • Importance of Data annotation
  • Data annotation with VGG annotator
  • Data annotation with labelImg
  • Data annotation with MakeSense.AI
  • Project 1

  • Introduction to CNN
  • CNN vs ANN
  • Importance of CNN
  • Architecture of CNN
  • Kernels, Channels, Filters, Stride and Padding
  • Convolutional, pooling and fully connected layers
  • Dropout and regularization methods
  • Building a custom convolutional neural network (a short Keras CNN sketch follows this module)
  • Model fine tuning
  • Project 1

  • Introduction to Transfer learning
  • Workflow and importance of Transfer learning
  • Working with pretrained models (a short ResNet50 sketch follows this module)
  • VGG models
  • ResNet models
  • Inception models
  • Project 1
  • Project 2
  • Project 3
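
A minimal transfer-learning sketch: a frozen ResNet50 backbone pretrained on ImageNet with a small custom head on top. The five-class output and the commented-out fit call are placeholders for a project's own data.

```python
import tensorflow as tf

# Load ResNet50 pretrained on ImageNet, without its classification head.
base = tf.keras.applications.ResNet50(weights="imagenet", include_top=False,
                                      input_shape=(224, 224, 3))
base.trainable = False  # freeze the pretrained convolutional backbone

# Attach a small custom head; the 5-class output is an assumption for illustration.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(5, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # datasets supplied by the project
```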

  • Introduction to Object detection
  • Object localization
  • Sliding window
  • Bounding boxes
  • Intersection over Union (IoU)
  • Non-Max suppression
  • Overlapping objects
  • Single shot detector (SSD)
  • Region-based CNN (R-CNN)
  • Fast R-CNN
  • Faster R-CNN
  • YOLO models
  • Deeplabv3
  • Project 1
  • Project 2
  • Project 3

  • Introduction to Image segmentation
  • Types of Image segmentation
  • Semantic segmentation
  • Instance segmentation
  • Mask R-CNN
  • UNet model
  • Detectron2
  • Project 1
  • Project 2

  • Getting started with GANs
  • Applications of GANs
  • Building custom GANs model
  • Working with DCGAN
  • Working with CycleGAN
  • Working with StyleGAN
  • Working with Pix2PixGAN
  • Working with SRGAN
  • Project 1
  • Project 2

  • Introduction to Natural language processing
  • Components of NLP – NLU and NLG
  • Importance of NLP
  • Why NLP is difficult
  • Industrial and real-world applications of NLP
  • NLP pipeline

  • Lexical / morphological analysis
  • Syntactic analysis
  • Semantic analysis
  • Discourse integration
  • Pragmatic analysis
  • Word Sense Disambiguation

  • Getting started with Text data
  • Basic operations on Text data
  • Splitting and joining strings
  • Working with regular expressions on text (re library)
  • Remove Punctuations, Digits and Stop words
  • Remove emojis and frequent words
  • Remove URLs, Unicode, ASCII codes and HTML tags
  • Spelling correction
  • Stemming and Lemmatization
  • Tokenization
  • Part of speech tagging (POS)
  • Named entity recognition (NER)
  • Chunking
  • Working with the NLTK library (a short text-cleaning sketch follows this module)
  • Working with SpaCy library
  • Working with Textblob library
  • Working with Gensim library
  • Project 1
  • Project 2
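
A minimal text-cleaning sketch with the re and NLTK libraries: lowercasing, URL/punctuation/digit removal, stop-word removal and lemmatization; the sample sentence is made up.

```python
import re
import nltk
from nltk.corpus import stopwords
from nltk.stem import WordNetLemmatizer

# One-time downloads of the NLTK resources used below.
nltk.download("stopwords", quiet=True)
nltk.download("wordnet", quiet=True)

text = "Visit https://example.com !! The movies were AMAZING :) 123"

text = text.lower()
text = re.sub(r"https?://\S+", " ", text)      # remove URLs
text = re.sub(r"[^a-z\s]", " ", text)          # remove punctuation, digits and symbols

tokens = text.split()                          # simple whitespace tokenization
tokens = [t for t in tokens if t not in stopwords.words("english")]  # remove stop words

lemmatizer = WordNetLemmatizer()
print([lemmatizer.lemmatize(t) for t in tokens])  # e.g. ['visit', 'movie', 'amazing']
```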

  • Bag of Words technique
  • TF-IDF technique (a short sketch covering both follows this module)
  • Word embedding – Word2Vec
  • Text Similarities – Euclidean distance, Cosine similarity and Jaccard similarity
  • Working with Word2Vec and GloVe embeddings
  • Project 1
  • Project 2
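
A minimal feature-extraction sketch showing Bag of Words, TF-IDF and cosine similarity with scikit-learn; the three toy documents are made up. Word2Vec/GloVe embeddings would be loaded separately (e.g. via Gensim) and are not shown here.

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "machine learning builds models from data",
    "deep learning is a branch of machine learning",
    "cats and dogs are popular pets",
]

# Bag of Words: raw token counts per document.
bow = CountVectorizer().fit_transform(docs)
print(bow.toarray())

# TF-IDF: counts reweighted by how rare each term is across documents.
tfidf = TfidfVectorizer().fit_transform(docs)

# Cosine similarity between documents in TF-IDF space:
# documents 0 and 1 are far more similar to each other than either is to document 2.
print(cosine_similarity(tfidf))
```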

  • Introduction to RNN
  • RNN vs ANN
  • Importance of RNN
  • Architecture of RNN
  • Working process of RNN
  • Building custom RNN model
  • Model fine tuning
  • Limitation of RNN
  • Project 1

  • Introduction to LSTM
  • How LSTM overcomes RNN limitations
  • Architecture of LSTM
  • Working process of LSTM
  • Building a custom LSTM model (a short Keras LSTM sketch follows this module)
  • Model fine tuning
  • Limitation of LSTM
  • Project 1
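
A minimal Keras LSTM sketch for binary text classification on the IMDB dataset bundled with Keras (downloaded on first use); the vocabulary size, sequence length, layer sizes and epoch count are illustrative.

```python
import tensorflow as tf

# IMDB reviews arrive pre-tokenized as integer sequences.
vocab_size, max_len = 10000, 200
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.imdb.load_data(num_words=vocab_size)
x_train = tf.keras.utils.pad_sequences(x_train, maxlen=max_len)
x_test = tf.keras.utils.pad_sequences(x_test, maxlen=max_len)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(max_len,)),
    tf.keras.layers.Embedding(vocab_size, 64),   # learned word embeddings
    tf.keras.layers.LSTM(64),                    # recurrent layer with gating
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, batch_size=128, validation_split=0.2)
print(model.evaluate(x_test, y_test))
```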

  • Introduction to GRU
  • Architecture of GRU
  • Working process of GRU
  • Building custom GRU
  • Model fine tuning
  • Limitation of GRU
  • Project 1

  • Introduction to sequence-to-sequence models
  • Understand the concept of Encoder and Decoder
  • Importance of Encoder and Decoder
  • Architecture of Encoder and Decoder
  • Use cases of Encoder and Decoder
  • Building custom Encoder and Decoder model
  • Model fine tuning
  • Limitation of Encoder and Decoder
  • Project 1

  • Introduction to Attention models
  • Types of Attention models
  • How attention models enhance the accuracy of Encoder and Decoder
  • Architecture of Attention models
  • Working process of Attention models
  • Building custom Attention models
  • Model fine tuning
  • Limitation of Attention models
  • Project 1

  • Introduction to Transformers
  • Architecture of Transformer
  • Working process of Transformer
  • Understand BERT Transformer and its architecture
  • Building custom transformer model
  • Model fine tuning
  • Project 1
  • Project 2

  • About Hugging Face
  • Introduction to Hugging Face Transformers
  • Working with pretrained transformers from Hugging Face (a short pipeline sketch follows this module)
  • Model fine tuning
  • Roberta Transformer
  • DistilBART Transformer
  • T5 Transformer
  • Pegasus Transformer
  • GPT-J & GPT-2 Transformers
  • Project 1
  • Project 2
  • Project 3
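
A minimal Hugging Face pipeline sketch: sentiment analysis with the default checkpoint and summarization with a DistilBART checkpoint. Both models are downloaded on first use, and the input text is made up.

```python
from transformers import pipeline

# Text classification with the pipeline's default pretrained checkpoint.
classifier = pipeline("sentiment-analysis")
print(classifier("This course covers NLP end to end and I enjoyed it."))

# Summarization with a DistilBART checkpoint.
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")
long_text = (
    "Transformers use self-attention to relate every token in a sequence to every "
    "other token, which lets them model long-range dependencies without recurrence. "
    "Pretrained checkpoints can then be fine-tuned on downstream tasks such as "
    "classification, summarization and question answering."
)
print(summarizer(long_text, max_length=40, min_length=10, do_sample=False))
```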

  • About Gen AI
  • Introduction to LLMs, RAG and Stable Diffusion
  • Building LLM applications using prompt engineering (a short prompt-engineering sketch follows this module)
  • Fine-tuning LLMs from scratch
  • Building a RAG (Retrieval-Augmented Generation) system with LangChain
  • Getting started with Stable Diffusion
  • Project 1
  • Project 2
  • Project 3
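
A minimal prompt-engineering sketch with the OpenAI Python SDK (v1 style); the model name is an assumption, an OPENAI_API_KEY environment variable is required, and LangChain-based RAG is not shown here.

```python
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

# Prompt engineering: the system prompt constrains the role and output format,
# while the user prompt carries the actual task.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name; any chat-capable model works
    messages=[
        {"role": "system",
         "content": "You are a data science tutor. Answer in exactly three bullet points."},
        {"role": "user", "content": "Explain the bias-variance tradeoff."},
    ],
    temperature=0.2,
)
print(response.choices[0].message.content)
```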

  • Introduction to OpenAI models
  • Web scraping techniques
  • FastAPI development (a short FastAPI sketch follows this module)
  • GitHub management
  • Project deployment
  • Level up your Kaggle profile
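
A minimal FastAPI sketch exposing a health check and a /predict endpoint; the request schema and the stub scoring logic are placeholders for a real trained model loaded at startup.

```python
# Save as app.py and run with:  uvicorn app:app --reload
from typing import List

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Model serving API")

class PredictRequest(BaseModel):
    features: List[float]   # numeric feature vector sent by the client

@app.get("/")
def health() -> dict:
    return {"status": "ok"}

@app.post("/predict")
def predict(req: PredictRequest) -> dict:
    # A real deployment would load a trained model (e.g. with joblib) and call model.predict.
    # A stub score keeps the sketch self-contained.
    score = sum(req.features) / max(len(req.features), 1)
    return {"prediction": score}
```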
