Handling text and human language is a tedious job. Not only is a lot of data cleansing needed, but multiple levels of preprocessing are also required, depending on the algorithm you apply. This is because human languages are ambiguous and sentence structures can be very complex. Arguably, the most challenging part of any natural language processing problem is finding the accurate meaning of words and sentences.

So why must you know attention if all you want is to apply deep learning to natural language processing tasks? The goal of this book is to introduce the attention mechanisms that can extract key information from a sequence, and to show how a transformer model, in which an attention mechanism is applied, is built and used. There is one main theme in this book: building a machine that can translate an English sentence into German. This is not an introduction to natural language processing techniques, so before you read this book, you should already know some language preprocessing terms, such as tokenization.
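Tokenization, the preprocessing term mentioned above, simply means splitting raw text into units a model can work with. As a minimal sketch (not the book's method), a naive word-level tokenizer can be written with the standard library alone; production pipelines typically use subword schemes such as BPE or WordPiece instead:

```python
import re

def tokenize(sentence):
    """Naive word-level tokenizer: lowercase the text and pull out
    runs of letters and digits, dropping punctuation. Real NLP
    pipelines use trained subword tokenizers, but the idea is the same:
    text in, a sequence of tokens out."""
    return re.findall(r"[a-z0-9]+", sentence.lower())

print(tokenize("Attention is all you need!"))
# → ['attention', 'is', 'all', 'you', 'need']
```

A tokenizer like this is usually the first stage of the pipeline; its output is then mapped to integer IDs before being fed to an attention-based model.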