
NVIDIA NeMo: An Open-Source Toolkit For Developing State-Of-The-Art Conversational AI Models In Three Lines Of Code

Source: marktechpost.com

NVIDIA’s open-source toolkit, NVIDIA NeMo (Neural Models), is a major step forward for Conversational AI. Built on PyTorch, it allows one to quickly build, train, and fine-tune conversational AI models.

As the world grows more digital, Conversational AI enables communication between humans and computers. It is the set of technologies behind applications such as automated messaging, speech recognition, voice chatbots, and text-to-speech. It broadly comprises three areas of AI research: automatic speech recognition (ASR), natural language processing (NLP), and speech synthesis (or text-to-speech, TTS).

Conversational AI has shaped the path of human-computer interaction, making it more accessible and exciting. The latest advancements in Conversational AI like NVIDIA NeMo help bridge the gap between machines and humans.

NVIDIA NeMo consists of two subparts: NeMo Core and NeMo Collections. NeMo Core provides the common elements shared by all models, whereas NeMo Collections groups models and modules by domain. In NeMo’s speech collection (nemo_asr), you’ll find models and various building blocks for speech recognition, command recognition, speaker identification, speaker verification, and voice activity detection. NeMo’s NLP collection (nemo_nlp) contains models for tasks such as question answering, punctuation, named entity recognition, and many others. Finally, in NeMo’s speech synthesis collection (nemo_tts), you’ll find several spectrogram generators and vocoders, which let you generate synthetic speech.

There are three main concepts in NeMo: model, neural module, and neural type. 

  • Models contain all the information necessary for training and fine-tuning: the neural network implementation, tokenization, data augmentation, the optimization algorithm, and infrastructure details such as the number of GPU nodes.
  • Neural modules are conceptual building blocks, such as encoders and decoders, each responsible for a different task. They represent the logical parts of a neural network and form the basis for describing a model and its training process. Collections contain many neural modules that can be reused whenever required.
  • Inputs and outputs of neural modules are typed with Neural Types. A Neural Type is a pair containing information about a tensor’s axis layout and the semantics of its elements. Every neural module has input_types and output_types properties that describe what kinds of inputs the module accepts and what types of outputs it returns.
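The neural-type idea can be illustrated with a small toy sketch (this is not NeMo’s actual implementation; the class and module names here are made up for illustration). A type pairs an axis layout with the semantics of the tensor’s elements, and modules declare input_types and output_types so that connections between modules can be checked:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ToyNeuralType:
    axes: tuple      # axis layout, e.g. ("B", "T") for batch x time
    elements: str    # semantic label for the elements, e.g. "AudioSignal"

class ToyASRModule:
    # A speech module accepts batched audio and emits per-step log-probabilities.
    input_types = {"signal": ToyNeuralType(("B", "T"), "AudioSignal")}
    output_types = {"log_probs": ToyNeuralType(("B", "T", "D"), "LogProbs")}

def compatible(out_type, in_type):
    # An output can feed an input only when both layout and semantics match.
    return out_type.axes == in_type.axes and out_type.elements == in_type.elements

mic = ToyNeuralType(("B", "T"), "AudioSignal")
print(compatible(mic, ToyASRModule.input_types["signal"]))      # True: layout and semantics agree
print(compatible(mic, ToyASRModule.output_types["log_probs"]))  # False: axis layouts differ
```

Because the check runs when modules are wired together rather than deep inside a forward pass, mismatched connections fail early with a clear error instead of producing silently wrong tensors.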

Because NeMo is built on PyTorch, it also works well with related projects such as PyTorch Lightning and Hydra. The Lightning integration makes it easy to train models with mixed precision using Tensor Cores and to scale training across multiple GPUs and compute nodes, and it adds features such as logging, checkpointing, and overfit checking. Hydra allows training scripts to be parametrized through well-organized configuration files, streamlining everyday tasks for users.
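As an illustrative sketch of the Hydra-style parametrization (the exact keys and defaults vary by model and NeMo version, so treat this excerpt as hypothetical), a training script’s YAML config might group trainer and model settings like this:

```yaml
# Hypothetical excerpt of a Hydra config for a NeMo training script.
trainer:
  gpus: 2          # GPUs per node
  num_nodes: 1
  max_epochs: 100
  precision: 16    # mixed precision on Tensor Cores

model:
  optim:
    name: adamw
    lr: 0.001
```

Any field can then be overridden from the command line without editing the file, e.g. `python train.py trainer.gpus=4 model.optim.lr=0.0005`.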
