Recurrent neural networks (RNNs) have proven to perform extremely well on temporal data. The family includes several variants: LSTMs, GRUs, and bidirectional RNNs.

Natural language processing is an important application of deep learning. Using word vector representations and embedding layers, you can train recurrent neural networks that achieve outstanding performance in a wide variety of industries. Example applications include sentiment analysis, named entity recognition, and machine translation.

Sequence models can be augmented with an attention mechanism, which helps the model learn where to focus given a sequence of inputs. The material also covers speech recognition and working with audio data.
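To make the attention idea concrete, here is a minimal sketch of dot-product attention in NumPy: given a query vector and a set of keys and values (one per time step), the model scores each time step, turns the scores into a probability distribution with softmax, and returns the weighted sum of the values. Function and variable names here are illustrative, not from the course code.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def dot_product_attention(query, keys, values):
    """Weight each time step's value by how well its key matches the
    query (scaled dot-product attention)."""
    scores = keys @ query / np.sqrt(query.shape[-1])  # one score per time step
    weights = softmax(scores)                         # attention distribution, sums to 1
    context = weights @ values                        # weighted sum of values
    return context, weights

# Toy example: 4 time steps, 3-dimensional hidden states.
rng = np.random.default_rng(0)
keys = rng.normal(size=(4, 3))
values = rng.normal(size=(4, 3))
query = keys[2]  # query taken from one of the keys, for illustration

context, weights = dot_product_attention(query, keys, values)
```

The `weights` vector is where the model "focuses its attention": time steps whose keys align with the query receive more of the probability mass, and their values dominate the returned context vector.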
isohamnemesis/Sequence-Models
About: Coursera