Natural Language Processing with TensorFlow Training Course

Course

In City of London

Price on request

Description

  • Type

    Course

  • Location

    City of London

TensorFlow™ is an open source software library for numerical computation using data flow graphs.
SyntaxNet is a neural-network Natural Language Processing framework for TensorFlow.
Word2Vec is a computationally efficient predictive model for learning vector representations of words, known as "word embeddings", from raw text. It comes in two flavors: the Continuous Bag-of-Words (CBOW) model and the Skip-Gram model (Sections 3.1 and 3.2 in Mikolov et al.).
Used in tandem, SyntaxNet and Word2Vec allow users to generate learned embedding models from natural-language input.
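
To make the distinction between the two flavors concrete, the short sketch below (not part of the course material; the sentence and window size are illustrative) builds training pairs for both: CBOW predicts a word from its surrounding context, while Skip-Gram predicts each context word from the word itself.

    # Illustrative only: build CBOW and skip-gram training pairs from a toy sentence.
    sentence = "the quick brown fox jumps over the lazy dog".split()
    window = 2  # assumed context window size

    cbow_pairs = []       # (context words, target word)
    skipgram_pairs = []   # (input word, context word)

    for i, target in enumerate(sentence):
        context = [sentence[j]
                   for j in range(max(0, i - window), min(len(sentence), i + window + 1))
                   if j != i]
        cbow_pairs.append((context, target))                   # CBOW: context -> target
        skipgram_pairs.extend((target, c) for c in context)    # skip-gram: target -> each context word

    print(cbow_pairs[2])       # (['the', 'quick', 'fox', 'jumps'], 'brown')
    print(skipgram_pairs[:4])  # [('the', 'quick'), ('the', 'brown'), ('quick', 'the'), ...]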

Audience

This course is targeted at developers and engineers who intend to work with SyntaxNet and Word2Vec models in their TensorFlow graphs.

After completing this course, delegates will:

  • understand TensorFlow's structure and deployment mechanisms
  • be able to carry out installation, production-environment and architecture tasks, and configuration
  • be able to assess code quality and perform debugging and monitoring
  • be able to implement advanced production tasks such as training models, embedding terms, building graphs and logging

Facilities

  • Location

    City of London (London), Token House, 11-12 Tokenhouse Yard, EC2R 7AS

  • Start date

    On request

Subjects

  • Monitoring
  • Word
  • NLP
  • Motivation
  • Network
  • Installation
  • Quality
  • Logic
  • Web
  • Writing
  • Threading
  • Production

Course programme

Getting Started

  • Setup and Installation
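
A minimal smoke test for the installation, assuming a TensorFlow 1.x (graph-and-session) release as used throughout this programme:

    # Verify that TensorFlow is installed and a session can run a trivial graph.
    import tensorflow as tf

    hello = tf.constant('Hello, TensorFlow!')
    with tf.Session() as sess:
        print(tf.__version__)       # installed TensorFlow version
        print(sess.run(hello))      # b'Hello, TensorFlow!'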

TensorFlow Basics

  • Creation, Initializing, Saving, and Restoring TensorFlow variables
  • Feeding, Reading and Preloading TensorFlow Data
  • How to use TensorFlow infrastructure to train models at scale
  • Visualizing and Evaluating models with TensorBoard
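
A condensed sketch of the workflow this module covers, assuming TensorFlow 1.x; the log directory and checkpoint path are illustrative:

    # Create, initialize, save and restore a variable, feed data through a
    # placeholder, and log a scalar summary for TensorBoard.
    import tensorflow as tf

    x = tf.placeholder(tf.float32, shape=[None], name='x')   # fed at run time
    w = tf.Variable(2.0, name='w')                            # created here, initialized below
    y = w * x

    tf.summary.scalar('w', w)                                 # visible in TensorBoard
    merged = tf.summary.merge_all()
    saver = tf.train.Saver()

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        writer = tf.summary.FileWriter('/tmp/tf_basics', sess.graph)  # illustrative log dir
        out, summary = sess.run([y, merged], feed_dict={x: [1.0, 2.0, 3.0]})
        writer.add_summary(summary, global_step=0)
        saver.save(sess, '/tmp/tf_basics/model.ckpt')          # checkpoint the variables

    with tf.Session() as sess:
        saver.restore(sess, '/tmp/tf_basics/model.ckpt')       # restore without re-initializing
        print(sess.run(w))                                      # 2.0

The summaries written to the log directory can then be inspected with TensorBoard.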

TensorFlow Mechanics 101

  • Prepare the Data
    • Download
    • Inputs and Placeholders
  • Build the Graph
    • Inference
    • Loss
    • Training
  • Train the Model
    • The Graph
    • The Session
    • Train Loop
  • Evaluate the Model
    • Build the Eval Graph
    • Eval Output
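
The inference / loss / training / evaluation structure above can be sketched end to end as follows, assuming TensorFlow 1.x and substituting a toy linear-regression task for the tutorial's dataset:

    import numpy as np
    import tensorflow as tf

    # Inputs and placeholders
    x = tf.placeholder(tf.float32, shape=[None, 1])
    y_true = tf.placeholder(tf.float32, shape=[None, 1])

    # Inference: the model's forward pass
    w = tf.Variable(tf.zeros([1, 1]))
    b = tf.Variable(tf.zeros([1]))
    y_pred = tf.matmul(x, w) + b

    # Loss and training op
    loss = tf.reduce_mean(tf.square(y_pred - y_true))
    train_op = tf.train.GradientDescentOptimizer(learning_rate=0.1).minimize(loss)

    # Toy data: y = 3x + 1 plus noise
    data_x = np.random.rand(100, 1).astype(np.float32)
    data_y = 3.0 * data_x + 1.0 + 0.01 * np.random.randn(100, 1).astype(np.float32)

    # The session and train loop
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for step in range(200):
            _, loss_val = sess.run([train_op, loss], feed_dict={x: data_x, y_true: data_y})
        # Evaluation: rerun the loss after training (a real eval graph would use held-out data)
        print('final loss:', loss_val, 'w:', sess.run(w), 'b:', sess.run(b))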

Advanced Usage

  • Threading and Queues
  • Distributed TensorFlow
  • Writing Documentation and Sharing your Model
  • Customizing Data Readers
  • Using GPUs
  • Manipulating TensorFlow Model Files
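
As an illustration of the threading-and-queues topic, the sketch below (TensorFlow 1.x assumed; capacities and shapes are arbitrary) feeds a FIFOQueue from background threads managed by a QueueRunner and a Coordinator:

    import tensorflow as tf

    # A queue of scalar floats, filled by background enqueue threads.
    queue = tf.FIFOQueue(capacity=32, dtypes=[tf.float32], shapes=[[]])
    enqueue_op = queue.enqueue_many([tf.random_normal([8])])   # each call pushes 8 random values
    batch = queue.dequeue()                                     # consumers pull one element at a time

    qr = tf.train.QueueRunner(queue, [enqueue_op] * 2)          # two enqueue threads
    tf.train.add_queue_runner(qr)

    with tf.Session() as sess:
        coord = tf.train.Coordinator()
        threads = tf.train.start_queue_runners(sess=sess, coord=coord)
        for _ in range(5):
            print(sess.run(batch))
        coord.request_stop()       # ask the background threads to finish
        coord.join(threads)

Device placement and distributed execution follow the same graph-construction pattern, for example wrapping operations in tf.device('/gpu:0') or in a tf.train.replica_device_setter built from a tf.train.ClusterSpec.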

TensorFlow Serving

  • Introduction
  • Basic Serving Tutorial
  • Advanced Serving Tutorial
  • Serving Inception Model Tutorial
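
A minimal export sketch, assuming TensorFlow 1.x and an illustrative model and path; the resulting SavedModel directory is what TensorFlow Serving loads:

    import tensorflow as tf

    # A trivial stand-in for a trained model.
    x = tf.placeholder(tf.float32, shape=[None, 1], name='x')
    w = tf.Variable([[2.0]])
    y = tf.matmul(x, w, name='y')

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())

        # Export the graph and variables as a versioned SavedModel.
        builder = tf.saved_model.builder.SavedModelBuilder('/tmp/nlp_model/1')  # illustrative path
        signature = tf.saved_model.signature_def_utils.build_signature_def(
            inputs={'x': tf.saved_model.utils.build_tensor_info(x)},
            outputs={'y': tf.saved_model.utils.build_tensor_info(y)},
            method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME)
        builder.add_meta_graph_and_variables(
            sess,
            [tf.saved_model.tag_constants.SERVING],
            signature_def_map={'predict': signature})
        builder.save()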

Getting Started with SyntaxNet

  • Parsing from Standard Input
  • Annotating a Corpus
  • Configuring the Python Scripts

Building an NLP Pipeline with SyntaxNet

  • Obtaining Data
  • Part-of-Speech Tagging
  • Training the SyntaxNet POS Tagger
  • Preprocessing with the Tagger
  • Dependency Parsing: Transition-Based Parsing
  • Training a Parser Step 1: Local Pretraining
  • Training a Parser Step 2: Global Training

Vector Representations of Words

  • Motivation: Why Learn Word Embeddings?
  • Scaling up with Noise-Contrastive Training
  • The Skip-gram Model
  • Building the Graph
  • Training the Model
  • Visualizing the Learned Embeddings
  • Evaluating Embeddings: Analogical Reasoning
  • Optimizing the Implementation
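
A condensed sketch of the skip-gram graph with noise-contrastive estimation, following the structure of TensorFlow's word2vec tutorial (TensorFlow 1.x assumed; vocabulary, batch and sample sizes are illustrative):

    import math
    import tensorflow as tf

    vocabulary_size = 10000
    embedding_size = 128
    batch_size = 64
    num_sampled = 16          # negative samples drawn per batch

    train_inputs = tf.placeholder(tf.int32, shape=[batch_size])      # centre-word ids
    train_labels = tf.placeholder(tf.int32, shape=[batch_size, 1])   # context-word ids

    # The embedding matrix is the object we actually want to learn.
    embeddings = tf.Variable(
        tf.random_uniform([vocabulary_size, embedding_size], -1.0, 1.0))
    embed = tf.nn.embedding_lookup(embeddings, train_inputs)

    # NCE loss: distinguish the true context word from sampled noise words.
    nce_weights = tf.Variable(
        tf.truncated_normal([vocabulary_size, embedding_size],
                            stddev=1.0 / math.sqrt(embedding_size)))
    nce_biases = tf.Variable(tf.zeros([vocabulary_size]))

    loss = tf.reduce_mean(
        tf.nn.nce_loss(weights=nce_weights,
                       biases=nce_biases,
                       labels=train_labels,
                       inputs=embed,
                       num_sampled=num_sampled,
                       num_classes=vocabulary_size))
    optimizer = tf.train.GradientDescentOptimizer(1.0).minimize(loss)

    # Training would feed (input, label) skip-gram pairs, e.g. produced as in the
    # CBOW/skip-gram sketch earlier, mapped to integer ids.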
