Deep Learning Training in Hyderabad

Deep Learning and Natural Language Processing (NLP) Training in Hyderabad.

Section – I (Deep Neural Networks, Convolutional Neural Networks)

  • Introduction to Neural Networks

  • Linear Regression & Gradient Descent (Batch, Stochastic, and Mini-Batch)
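
To make the gradient-descent variants concrete, here is a minimal NumPy sketch; the toy data, learning rate, and batch size are assumptions for illustration, not part of the course material:

```python
import numpy as np

# Toy data: y ≈ 3x + 2 with noise (assumed for illustration).
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 200)
y = 3 * X + 2 + rng.normal(0, 0.1, 200)

w, b, lr = 0.0, 0.0, 0.1
batch_size = 20  # 200 -> batch GD, 1 -> stochastic GD, in between -> mini-batch GD

for epoch in range(200):
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        i = idx[start:start + batch_size]
        err = (w * X[i] + b) - y[i]
        # Gradients of mean squared error over the current batch.
        w -= lr * 2 * np.mean(err * X[i])
        b -= lr * 2 * np.mean(err)

print(round(w, 2), round(b, 2))  # should approach 3 and 2
```

The three variants differ only in how many samples feed each update: all of them (batch), one (stochastic), or a small subset (mini-batch).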

  • Logistic/Sigmoid neuron

    1. Forward propagation
    2. Back propagation
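
A sketch of one forward and one backward pass through a single sigmoid (logistic) neuron; the input, weights, and learning rate below are illustrative assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -1.2, 0.3])    # one training example (assumed)
w = np.array([0.1, 0.4, -0.2])    # weights
b, y, lr = 0.0, 1.0, 0.1          # bias, target label, learning rate

# Forward propagation
z = np.dot(w, x) + b
a = sigmoid(z)
loss = -(y * np.log(a) + (1 - y) * np.log(1 - a))   # binary cross-entropy

# Back propagation: with cross-entropy loss, dLoss/dz simplifies to (a - y)
dz = a - y
dw = dz * x
db = dz
w -= lr * dw    # one gradient-descent update
b -= lr * db
```
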
  • Neural Network Architecture

    1. Layers of a Deep Neural Network
    2. Back propagation
    3. Activation Functions (Sigmoid, Tanh, ReLU, Leaky ReLU, Softmax)
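
The listed activation functions, written out in NumPy for reference:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))        # squashes to (0, 1)

def tanh(z):
    return np.tanh(z)                       # squashes to (-1, 1)

def relu(z):
    return np.maximum(0.0, z)               # zero for negative inputs

def leaky_relu(z, alpha=0.01):
    return np.where(z > 0, z, alpha * z)    # small slope for negatives

def softmax(z):
    e = np.exp(z - np.max(z))               # shift by max for numerical stability
    return e / e.sum()                      # outputs sum to 1
```
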
  • Introduction to TensorFlow

    1. Construction Phase
      1. tf.Variable
      2. tf.constant
      3. tf.placeholder
      4. Tensor reshape, slice, type cast
      5. Variable collections – Global, Local, Trainable
      6. Initializing Variables
    2. Execution Phase
    3. Linear Regression with TensorFlow
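
Since the outline references tf.placeholder and separate construction/execution phases, it appears to target the TensorFlow 1.x graph API; here is a minimal linear-regression sketch in that style, using the compat.v1 shim so it also runs under TensorFlow 2 (the toy data is assumed):

```python
import numpy as np
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

# Construction phase: build the computation graph.
x = tf.placeholder(tf.float32, shape=[None])   # input feature
y = tf.placeholder(tf.float32, shape=[None])   # target
w = tf.Variable(0.0)
b = tf.Variable(0.0)
y_hat = w * x + b
loss = tf.reduce_mean(tf.square(y_hat - y))
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

# Execution phase: run the graph inside a session.
x_data = np.linspace(0, 1, 100).astype(np.float32)
y_data = 3 * x_data + 2                        # toy data (assumed)
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(500):
        sess.run(train_op, feed_dict={x: x_data, y: y_data})
    print(sess.run([w, b]))                    # should approach [3, 2]
```
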
  • Build a handwritten digit recognition model with TensorFlow
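
A compact tf.keras version of the handwritten-digit model on MNIST; the layer sizes and epoch count are assumptions, and the course may build this at a lower level:

```python
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),  # one unit per digit
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test, y_test)
```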

  • Regularizing Deep Neural Networks

    1. L1, L2 regularization
    2. Dropout regularization
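
A sketch of L2 and dropout regularization in tf.keras; the penalty weight and dropout rate are illustrative assumptions:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(
        128, activation="relu",
        kernel_regularizer=tf.keras.regularizers.l2(1e-4)),  # L2 weight penalty
    # tf.keras.regularizers.l1 and l1_l2 plug in the same way.
    tf.keras.layers.Dropout(0.5),   # randomly zero 50% of units during training
    tf.keras.layers.Dense(10, activation="softmax"),
])
```
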
  • Vanishing & Exploding Gradients

    1. Weight initializations (He/Xavier initialization)
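
The two initialization schemes in one NumPy sketch; the fan_in/fan_out values are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
fan_in, fan_out = 784, 128          # example layer dimensions (assumed)

# Xavier/Glorot initialization: variance 2 / (fan_in + fan_out),
# suited to sigmoid/tanh activations.
w_xavier = rng.normal(0.0, np.sqrt(2.0 / (fan_in + fan_out)), (fan_in, fan_out))

# He initialization: variance 2 / fan_in, suited to ReLU activations.
w_he = rng.normal(0.0, np.sqrt(2.0 / fan_in), (fan_in, fan_out))
```
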
  • Algorithm Optimizers

    1. Momentum – Exponentially weighted moving average
    2. Gradient Descent with Momentum
    3. Gradient Descent with RMSProp (Root Mean Square Propagation)
    4. Gradient Descent with Adam (Adaptive Moment Estimation)
    5. Batch Normalization
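
The update rules behind momentum, RMSProp, and Adam, sketched in NumPy for a single parameter (hyperparameters are the common defaults, assumed here; the toy objective is made up):

```python
import numpy as np

lr, beta1, beta2, eps = 0.001, 0.9, 0.999, 1e-8

def adam_step(w, grad, m, v, t):
    m = beta1 * m + (1 - beta1) * grad           # momentum: EWMA of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2      # RMSProp: EWMA of squared gradients
    m_hat = m / (1 - beta1 ** t)                 # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)  # Adam combines both terms
    return w, m, v

w, m, v = 1.0, 0.0, 0.0
for t in range(1, 1001):
    grad = 2 * w                                 # gradient of the toy objective w**2
    w, m, v = adam_step(w, grad, m, v, t)
print(w)                                         # approaches the minimum at 0
```

Momentum alone applies only the m term; RMSProp alone divides by the square root of the v term; Adam combines both with bias correction.
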
  • Introduction to CNN (Convolutional Neural Networks), Computer Vision

  • Convolution and Edge detection

  • Padding and Strided Convolutions

  • Convolutional Neural Network

    1. Edge Detection
    2. Padding
    3. Stride
    4. Pooling
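
A NumPy sketch of 2-D convolution (cross-correlation, as deep-learning layers implement it) with stride and 'valid' padding, plus max pooling; the image and the vertical-edge kernel are illustrative assumptions:

```python
import numpy as np

def conv2d(image, kernel, stride=1):
    kh, kw = kernel.shape
    oh = (image.shape[0] - kh) // stride + 1   # output height ('valid' padding)
    ow = (image.shape[1] - kw) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = image[i * stride:i * stride + kh, j * stride:j * stride + kw]
            out[i, j] = np.sum(patch * kernel)
    return out

def max_pool(x, size=2):
    # Non-overlapping size x size blocks, keeping each block's maximum.
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

# Image with a sharp vertical edge: left half dark, right half bright.
img = np.hstack([np.zeros((6, 3)), np.ones((6, 3)) * 10])
vertical_edge = np.array([[1, 0, -1],
                          [1, 0, -1],
                          [1, 0, -1]])
print(conv2d(img, vertical_edge))            # large magnitudes along the edge
print(max_pool(conv2d(img, vertical_edge)))  # downsampled feature map
```
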
  • ResNets (CNNs built with Residual Blocks)
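
A residual-block sketch in the tf.keras functional style; the filter count and input shape are assumptions, and real ResNets add batch normalization plus a projection shortcut when shapes change:

```python
import tensorflow as tf

def residual_block(x, filters=64):
    shortcut = x                                   # identity skip connection
    y = tf.keras.layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    y = tf.keras.layers.Conv2D(filters, 3, padding="same")(y)
    y = tf.keras.layers.Add()([shortcut, y])       # add the input back in
    return tf.keras.layers.Activation("relu")(y)

inputs = tf.keras.Input(shape=(32, 32, 64))        # channels must match `filters`
outputs = residual_block(inputs)
model = tf.keras.Model(inputs, outputs)
```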

  • Inception Network (multiple filter sizes, pooling, and strides combined in one layer)

  • Transfer Learning
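
A typical tf.keras transfer-learning sketch: reuse a pretrained convolutional base and train only a new head. The base network, image size, and binary head are assumptions:

```python
import tensorflow as tf

base = tf.keras.applications.MobileNetV2(
    input_shape=(160, 160, 3), include_top=False, weights="imagenet")
base.trainable = False                      # freeze the pretrained features

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # new task-specific head
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```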

  • Data Augmentation
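
A data-augmentation sketch with tf.keras preprocessing layers (available in TensorFlow 2.6+); the specific transforms and ranges are assumptions:

```python
import tensorflow as tf

augment = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),   # mirror images left-right
    tf.keras.layers.RandomRotation(0.1),        # rotate up to ±10% of a full turn
    tf.keras.layers.RandomZoom(0.1),            # zoom in/out up to 10%
])
# These layers are active only during training, e.g. as the first layers of a model.
```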

  • Computer Vision

    1. Object Localization
    2. Intersection over Union (see the sketch after this list)
    3. Anchor Boxes
    4. Non-Max Suppression
    5. YOLO Algorithm
    6. Object Detection
    7. Face Verification
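
Intersection over Union, the overlap measure that non-max suppression and YOLO rely on, in a short Python sketch; the [x1, y1, x2, y2] box format is an assumption:

```python
def iou(box_a, box_b):
    # Boxes as [x1, y1, x2, y2]; the intersection is the overlap rectangle.
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)   # union = sum minus overlap

print(iou([0, 0, 10, 10], [5, 5, 15, 15]))     # 25 / 175 ≈ 0.143
```

Non-max suppression keeps the highest-scoring box and discards any other box whose IoU with it exceeds a chosen threshold.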

Section – II (Natural Language Processing & Information Retrieval with Gensim, spaCy, and NLTK)

  • Understanding sentence structure
    1. Term-Document incidence matrix
    2. Inverted Index
    3. Text Normalization (see the NLTK sketch after this list)
      1. Tokenization
      2. Case folding
      3. Synonyms, Homonyms
      4. Spelling mistakes
      5. Stop words
      6. Stemming
      7. Lemmatization
    4. POS Tagging
    5. Context-Free Grammar
    6. Dependency Parsing
    7. Named Entity Recognition (NER) tagging
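
A text-normalization sketch with NLTK covering several of the steps above; the sample sentence is made up, and the resource names in the download calls follow the classic NLTK distributions (newer releases may use slightly different names):

```python
import nltk
from nltk.corpus import stopwords
from nltk.stem import PorterStemmer, WordNetLemmatizer

for pkg in ("punkt", "stopwords", "wordnet", "averaged_perceptron_tagger"):
    nltk.download(pkg, quiet=True)

text = "The cats were running quickly through New York."
tokens = nltk.word_tokenize(text.lower())          # tokenization + case folding
tokens = [t for t in tokens if t.isalpha()
          and t not in stopwords.words("english")] # drop stop words
print(tokens)                                      # ['cats', 'running', ...]

stemmer, lemmatizer = PorterStemmer(), WordNetLemmatizer()
print([stemmer.stem(t) for t in tokens])           # stemming: 'running' -> 'run'
print([lemmatizer.lemmatize(t) for t in tokens])   # lemmatization: 'cats' -> 'cat'
print(nltk.pos_tag(tokens))                        # POS tagging
```
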
  • Handling Phrase Queries (IR)
    1. Biword index
    2. Positional index
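
A minimal positional index in plain Python, the structure that makes phrase queries answerable; the toy documents are assumptions:

```python
from collections import defaultdict

docs = {0: "new york times", 1: "los angeles times", 2: "the times of new york"}

# term -> {doc_id: [positions]}
index = defaultdict(lambda: defaultdict(list))
for doc_id, text in docs.items():
    for pos, term in enumerate(text.split()):
        index[term][doc_id].append(pos)

# Phrase query "new york": docs where 'york' appears right after 'new'.
hits = [d for d in index["new"]
        if d in index["york"]
        and any(p + 1 in index["york"][d] for p in index["new"][d])]
print(hits)   # [0, 2]
```
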
  • Spelling Correction
    1. Soundex algorithm
    2. Isolated words
      1. Edit Distance (see the sketch after this list)
      2. Weighted edit distance
      3. N-Gram overlap (Jaccard coefficient)
    3. Context-sensitive correction
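
The classic dynamic-programming edit distance (Levenshtein), sketched in Python; a weighted variant would replace the unit costs below with per-operation weights:

```python
def edit_distance(a, b):
    # dp[i][j] = minimum edits to turn a[:i] into b[:j]
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(len(a) + 1):
        dp[i][0] = i                     # delete all of a[:i]
    for j in range(len(b) + 1):
        dp[0][j] = j                     # insert all of b[:j]
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1   # substitution cost
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution or match
    return dp[-1][-1]

print(edit_distance("kitten", "sitting"))   # 3
```
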
  • Document search and Rank Retrieval model
    1. Term Frequency, Weighted Term Frequency, Inverse Document Frequency
    2. TF-IDF Scoring
    3. Euclidean distance
    4. Cosine similarity
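
TF-IDF scoring and cosine-similarity ranking with scikit-learn; the corpus and query are made-up assumptions, and the course may derive the scores by hand instead:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = ["the cat sat on the mat",
          "the dog sat on the log",
          "cats and dogs are pets"]

vectorizer = TfidfVectorizer()            # term frequency x inverse doc frequency
tfidf = vectorizer.fit_transform(corpus)  # one row per document

# Rank documents against a query by cosine similarity.
query = vectorizer.transform(["cat on a mat"])
print(cosine_similarity(query, tfidf)[0])  # highest score for document 0
```
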
  • Topic Modelling & Text Summarization
    1. Singular Value Decomposition
    2. Latent Dirichlet Allocation (LDA)
    3. Latent Semantic Analysis
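
A Latent Dirichlet Allocation sketch with Gensim; the tiny toy corpus and the topic count are assumptions:

```python
from gensim import corpora, models

texts = [["machine", "learning", "model", "training"],
         ["neural", "network", "training", "model"],
         ["stock", "market", "price", "trading"],
         ["market", "trading", "investment", "price"]]

dictionary = corpora.Dictionary(texts)                 # term <-> id mapping
corpus = [dictionary.doc2bow(t) for t in texts]        # bag-of-words vectors

lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary, passes=10)
for topic in lda.print_topics():
    print(topic)   # two topics: roughly ML terms vs. finance terms
```
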
  • Word2vec (CBOW, Skip-Gram), GloVe, Doc2vec
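
Training Word2vec with Gensim; the parameters follow the Gensim 4.x API, and the toy sentences are assumptions:

```python
from gensim.models import Word2Vec

sentences = [["deep", "learning", "uses", "neural", "networks"],
             ["neural", "networks", "learn", "word", "vectors"],
             ["word", "vectors", "capture", "meaning"]]

# sg=1 selects Skip-Gram; sg=0 (the default) selects CBOW.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1)
print(model.wv.most_similar("neural", topn=3))
print(model.wv["word"].shape)   # (50,)
```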

Section – III (Recurrent Neural Networks for Text Analytics)

  • Recurrent Neural Networks
  • Bidirectional Recurrent Neural Networks
  • Gated Recurrent Units (GRU)
  • Long short-term memory (LSTM)
  • Auto encoders
  • TBD – RNN solutions for text problems.
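
A sentiment-style text classifier in tf.keras, one plausible instance of the RNN-for-text item above; the vocabulary size, sequence length, and layer sizes are assumptions:

```python
import tensorflow as tf

vocab_size, seq_len = 10000, 100   # assumed preprocessing parameters

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 64),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),  # reads both directions
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),           # binary sentiment
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.build(input_shape=(None, seq_len))
model.summary()
```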