Search results for machine_translation: 1 - 40 of 196

  • A Neural Network for Machine Translation, at Production Scale

    Posted by Quoc V. Le & Mike Schuster, Research Scientists, Google Brain Team Ten years ago, we announced the launch of Google Translate, together with the use of Phrase-Based Machine Translation as the key algorithm behind this service. Since then, rapid advances in machine intelligence have improved our speech recognition and image recognition capabilities, but improving machine translation remai

  • Paper review: Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation (GNMT) - Deep Learning Blog

    Hi, this is Ryobot. GNMT (Google's Neural Machine Translation) [Wu, 2016], the system behind Google Translate, is, put favorably, a method that follows the well-trodden road of neural machine translation and, put unfavorably, a best-of collection of existing techniques. It also marked the point at which large parallel corpora + monster-sized models + piles of GPUs became the norm. It is an ideal primer for quickly getting up to speed on NMT through 2016. Its WMT'14 BLEU scores are 39.9 (En-Fr) and 24.6 (En-De), ranking 5th (1st when it appeared). Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation [Yonghui Wu,

  • Zero-Shot Translation with Google’s Multilingual Neural Machine Translation System

    Posted by Mike Schuster (Google Brain Team), Melvin Johnson (Google Translate) and Nikhil Thorat (Google Brain Team) In the last 10 years, Google Translate has grown from supporting just a few languages to 103, translating over 140 billion words every day. To make this possible, we needed to build and maintain many different systems in order to translate between any two languages, incurring signif

  • Peeking into the neural network architecture used for Google's Neural Machine Translation

    Peeking into the neural network architecture used for Google's Neural Machine Translation November 17, 2016 The Google Neural Machine Translation paper (GNMT) describes an interesting approach towards deep learning in production. The paper and architecture are non-standard, in many cases deviating far from what you might expect from an architecture you'd find in an academic paper. Emphasis is plac

  • [PDF] Google’s Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation

    Google’s Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation Melvin Johnson, Mike Schuster, Quoc V. Le, Maxim Krikun, Yonghui Wu, Zhifeng Chen, Nikhil Thorat melvinp,schuster,qvl,krikun,yonghui,zhifengc,nsthorat@google.com Fernanda Viégas, Martin Wattenberg, Greg Corrado, Macduff Hughes, Jeffrey Dean Abstract We propose a simple, elegant solution to use a single Neural Ma

  • Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation

    Neural Machine Translation (NMT) is an end-to-end learning approach for automated translation, with the potential to overcome many of the weaknesses of conventional phrase-based translation systems. Unfortunately, NMT systems are known to be computationally expensive both in training and in translation inference. Also, most NMT systems have difficulty with rare words. These issues have hindered NM

  • OpenNMT - Open-Source Neural Machine Translation

    OpenNMT is an open source ecosystem for neural machine translation and neural sequence learning. Started in December 2016 by the Harvard NLP group and SYSTRAN, the project has since been used in several research and industry applications. It is currently maintained by SYSTRAN and Ubiqus. OpenNMT provides implementations in 2 popular deep learning frameworks:

  • Statistical and Neural Machine Translation

    This website contains resources for research in statistical and neural machine translation, i.e. the translation of text from one human language to another by a computer that learned how to translate from vast amounts of translated text. Events Conference on machine translation: 2022, 2021, 2020, 2019, 2018, 2017, 2016. Workshop on machine translation: 2015. 2014. 2013. 2012. 2011. 2010. 2009. 200

  • A novel approach to neural machine translation

    Language translation is important to Facebook’s mission of making the world more open and connected, enabling everyone to consume posts or videos in their preferred language — all at the highest possible accuracy and speed. Today, the Facebook Artificial Intelligence Research (FAIR) team published research results using a novel convolutional neural network (CNN) approach for language translation t

  • GitHub - tensorflow/nmt: TensorFlow Neural Machine Translation Tutorial

  • Introduction to Neural Machine Translation with GPUs (part 1) | NVIDIA Technical Blog

    Introduction to Neural Machine Translation with GPUs (part 1) Note: This is the first part of a detailed three-part series on machine translation with neural networks by Kyunghyun Cho. You may enjoy part 2 and part 3. Neural machine translation is a recently proposed framework for machine translation based purely on neural networks. This post is the first of a series in which I will explain a simp

  • Paper review: Depthwise Separable Convolution for Neural Machine Translation (SliceNet) - Deep Learning Blog

    Hi, this is Ryobot. Tensor decomposition may have been a quiet boom of 2017. The number of papers is not large, but methods using tensor decomposition achieved state-of-the-art results across the board in medium-scale language modeling [1], large-scale language modeling [2], machine translation (this paper) [3], and action recognition [4]. Breaking the Softmax Bottleneck: A High-Rank RNN Language Model [1] Factorization tricks for LSTM networks [2] Depthwise Separable Convolutions for Neural Machine Translation [3] Learning Compact Recurrent Neural Networks with Block-Term Tensor Decompo
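    As a rough illustration of the building block this post covers (not the SliceNet architecture itself), a depthwise separable 1-D convolution factors a standard convolution into a per-channel depthwise convolution followed by a 1x1 pointwise convolution. The sketch below uses PyTorch with made-up sizes.

      import torch
      import torch.nn as nn

      class DepthwiseSeparableConv1d(nn.Module):
          """Depthwise conv (one filter per input channel) followed by a 1x1 pointwise conv."""
          def __init__(self, in_ch: int, out_ch: int, kernel_size: int):
              super().__init__()
              self.depthwise = nn.Conv1d(in_ch, in_ch, kernel_size,
                                         padding=kernel_size // 2, groups=in_ch)
              self.pointwise = nn.Conv1d(in_ch, out_ch, kernel_size=1)

          def forward(self, x):  # x: (batch, channels, sequence_length)
              return self.pointwise(self.depthwise(x))

      x = torch.randn(2, 64, 20)                 # toy batch of 20-step sequences
      layer = DepthwiseSeparableConv1d(64, 128, kernel_size=3)
      print(layer(x).shape)                      # torch.Size([2, 128, 20])
      # Weight count is roughly 64*3 + 64*128 instead of 64*128*3 for a full
      # convolution, which is where the efficiency gain comes from.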

  • Google's Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation

    We propose a simple solution to use a single Neural Machine Translation (NMT) model to translate between multiple languages. Our solution requires no change in the model architecture from our base system but instead introduces an artificial token at the beginning of the input sentence to specify the required target language. The rest of the model, which includes encoder, decoder and attention, rem
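    The "artificial token" trick is easy to reproduce in a preprocessing step: prepend a token naming the desired output language to each source sentence and train one shared model on the pooled data. A minimal sketch follows; the "<2xx>" token format and the helper name are made up for illustration and may differ from the paper's exact convention.

      def add_target_token(source_sentence: str, target_lang: str) -> str:
          """Prefix the source with a token that selects the target language."""
          return f"<2{target_lang}> {source_sentence}"

      # Pool several language pairs into one training stream for a single model.
      examples = [
          ("How are you?", "¿Cómo estás?", "es"),
          ("How are you?", "Wie geht es dir?", "de"),
      ]
      training_pairs = [(add_target_token(src, lang), tgt) for src, tgt, lang in examples]
      for src, tgt in training_pairs:
          print(src, "->", tgt)
      # At inference time the same token steers the shared model, which is what
      # enables zero-shot directions that never appeared as a pair in training.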

  • Visualizing A Neural Machine Translation Model (Mechanics of Seq2seq Models With Attention)

    Visualizing A Neural Machine Translation Model (Mechanics of Seq2seq Models With Attention) Translations: Chinese (Simplified), French, Japanese, Korean, Persian, Russian, Turkish Watch: MIT’s Deep Learning State of the Art lecture referencing this post May 25th update: New graphics (RNN animation, word embedding graph), color coding, elaborated on the final attention example. Note: The animations

  • Machine Translation | Pangeanic

    MACHINE TRANSLATION Translate more content, faster and more securely with our near-human-quality Machine Translation Discover PangeaMT, our AI Machine Translation software to help you optimize your company's translations and save up to 70%. Want to know how? Listed in Gartner Hype Cycle of NLP Technologies - Neural Machine Translation In Gartner's recent analysis on the risks and opportunities in

  • Neural Machine Translation by Jointly Learning to Align and Translate

    Neural machine translation is a recently proposed approach to machine translation. Unlike the traditional statistical machine translation, the neural machine translation aims at building a single neural network that can be jointly tuned to maximize the translation performance. The models proposed recently for neural machine translation often belong to a family of encoder-decoders and consists of a
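    As a rough sketch of the soft-alignment idea (toy dimensions, not the paper's exact parameterization), the attention weights for one decoder step can be computed as a softmax over additive scores of the previous decoder state against every encoder annotation, and the context vector is the corresponding weighted sum:

      import torch

      torch.manual_seed(0)
      src_len, enc_dim, dec_dim, attn_dim = 6, 8, 8, 16   # made-up toy sizes

      enc_states = torch.randn(src_len, enc_dim)   # encoder annotations h_1..h_T
      dec_state = torch.randn(dec_dim)             # previous decoder state s_{t-1}

      # Additive ("concat") scoring: score_j = v^T tanh(W_a s_{t-1} + U_a h_j)
      W_a = torch.randn(attn_dim, dec_dim)
      U_a = torch.randn(attn_dim, enc_dim)
      v = torch.randn(attn_dim)

      scores = torch.tanh(enc_states @ U_a.T + dec_state @ W_a.T) @ v   # (src_len,)
      weights = torch.softmax(scores, dim=0)       # soft alignment over source positions
      context = weights @ enc_states               # weighted sum fed to the decoder
      print(weights, context)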

  • Mozilla releases local machine translation tools as part of Project Bergamot | The Mozilla Blog

    Mozilla releases local machine translation tools as part of Project Bergamot In January of 2019, Mozilla joined the University of Edinburgh, Charles University, University of Sheffield and University of Tartu as part of a project funded by the European Union called Project Bergamot. The ultimate goal of this consortium was to build a set of neural machine translation tools that would enable Mozill

  • Neural Machine Translation in Linear Time

    We present a novel neural network for processing sequences. The ByteNet is a one-dimensional convolutional neural network that is composed of two parts, one to encode the source sequence and the other to decode the target sequence. The two network parts are connected by stacking the decoder on top of the encoder and preserving the temporal resolution of the sequences. To address the differing leng

  • GitHub - neubig/nmt-tips: A tutorial about neural machine translation including tips on building practical systems

  • A Neural Network for Machine Translation, at Production Scale

  • Phrase-Based & Neural Unsupervised Machine Translation

    Machine translation systems achieve near human-level performance on some languages, yet their effectiveness strongly relies on the availability of large amounts of parallel sentences, which hinders their applicability to the majority of language pairs. This work investigates how to learn to translate when having access to only large monolingual corpora in each language. We propose two model varian

  • Asia-Pacific Association for Machine Translation (AAMT)

    The Association is made up of three groups: researchers and developers of machine translation, manufacturers and vendors, and users. To help realize smooth global communication, it works to advance machine translation through the development, improvement, education, and dissemination of machine translation systems. Notes on the translated versions of this site.

  • Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation

  • Building Your Own Neural Machine Translation System in TensorFlow

    Posted by Thang Luong, Research Scientist, and Eugene Brevdo, Staff Software Engineer, Google Brain Team Machine translation – the task of automatically translating between languages – is one of the most active research areas in the machine learning community. Among the many approaches to machine translation, sequence-to-sequence ("seq2seq") models [1, 2] have recently enjoyed great success and ha

  • Book: Statistical Machine Translation

    Philipp Koehn Hardcover, 488 pages Publisher: Cambridge University Press ISBN-10: 0521874157 ISBN-13: 978-0521874151 Content (more slides will be added over the next weeks) Chapter 1: Introduction Chapter 2: Words, Sentences, Corpora Chapter 3: Probability Theory Chapter 4: Word-Based Models slides Chapter 5: Phrase-Based Models slides Chapter 6: Decoding slides Chapter 7: Language Models slides C

  • Philipp Koehn's Statistical Machine Translation - 武蔵野日記

    Since I had just written about machine translation, here is a book recommendation as well. I have the impression it had been "in press" since around 2007, but it was finally published last month, so I bought it. Statistical Machine Translation, by Philipp Koehn; publisher: Cambridge University Press; published 2009/12/17; hardcover. The author, Philipp Koehn, is famous for developing Pharaoh for statistical machine translation, and more recently for the open-source (GPL) translation tool Moses. Incidentally, both toolkits are de facto standards in the machine translation world (Pharaoh was in use until a few years ago, and since Moses was developed, Pharaoh's place has been taken by M

  • Foundations of Statistical Machine Translation: Past, Present and Future

    Foundations of Statistical Machine Translation: Past, Present and Future Taro Watanabe taro.watanabe @ nict.go.jp http://mastarpj.nict.go.jp/~t_watana/ 1 20 years history • Statistical Machine Translation (SMT) started from Brown et al. (1990) • Is SMT matured? • Real service: Web-based (Google, Microsoft), mobile phone (NICT) • Promising gains from Tree-based approaches • Syntax-based SMT in {tre

  • Asia-Pacific Association for Machine Translation (AAMT)

    The Association is made up of three groups: researchers and developers of machine translation, manufacturers and vendors, and users. To help realize smooth global communication, it works to advance machine translation through the development, improvement, education, and dissemination of machine translation systems. Notes on the translated versions of this site.

  • Neural Machine Translation and Sequence-to-sequence Models: A Tutorial

    This tutorial introduces a new and powerful set of techniques variously called "neural machine translation" or "neural sequence-to-sequence models". These techniques have been used in a number of tasks regarding the handling of human language, and can be a powerful tool in the toolbox of anyone who wants to model sequential data of some sort. The tutorial assumes that the reader knows the basics o

  • LibreTranslate - Free and Open Source Machine Translation API

    Open Source Machine Translation API Self-Hosted. Offline Capable. Easy to Setup.
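    A self-hosted instance is typically called over plain HTTP. The sketch below assumes a local instance on port 5000 and the /translate endpoint's JSON fields (q, source, target); treat the host, port, and field names as assumptions that may vary between versions.

      import requests  # third-party HTTP client

      resp = requests.post(
          "http://localhost:5000/translate",   # hypothetical local instance
          json={"q": "Hello, world!", "source": "en", "target": "es", "format": "text"},
          timeout=10,
      )
      resp.raise_for_status()
      print(resp.json().get("translatedText"))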

  • Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation

    In this paper, we propose a novel neural network model called RNN Encoder-Decoder that consists of two recurrent neural networks (RNN). One RNN encodes a sequence of symbols into a fixed-length vector representation, and the other decodes the representation into another sequence of symbols. The encoder and decoder of the proposed model are jointly trained to maximize the conditional probability of
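    A stripped-down version of that encoder-decoder idea fits in a few lines of PyTorch; this is a toy sketch with invented dimensions and vocabulary sizes, not the paper's GRU-based model with its gating details.

      import torch
      import torch.nn as nn

      class TinyEncoderDecoder(nn.Module):
          """One RNN compresses the source into a fixed-length vector;
          a second RNN unrolls that vector into the target sequence."""
          def __init__(self, src_vocab, tgt_vocab, emb=32, hidden=64):
              super().__init__()
              self.src_emb = nn.Embedding(src_vocab, emb)
              self.tgt_emb = nn.Embedding(tgt_vocab, emb)
              self.encoder = nn.GRU(emb, hidden, batch_first=True)
              self.decoder = nn.GRU(emb, hidden, batch_first=True)
              self.out = nn.Linear(hidden, tgt_vocab)

          def forward(self, src_ids, tgt_ids):
              _, summary = self.encoder(self.src_emb(src_ids))    # fixed-length summary
              dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), summary)
              return self.out(dec_out)                            # logits per target position

      model = TinyEncoderDecoder(src_vocab=1000, tgt_vocab=1000)
      src = torch.randint(0, 1000, (4, 7))   # toy batch: 4 source sentences of length 7
      tgt = torch.randint(0, 1000, (4, 9))   # target tokens (teacher forcing)
      logits = model(src, tgt)
      loss = nn.functional.cross_entropy(logits.reshape(-1, 1000), tgt.reshape(-1))
      print(logits.shape, loss.item())       # training maximizes the conditional likelihood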

  • Zero-Shot Translation with Google’s Multilingual Neural Machine Translation System

  • Deep Learning for Natural Language Processing and Machine Translation

  • GitHub - LibreTranslate/LibreTranslate: Free and Open Source Machine Translation API. Self-hosted, offline capable and easy to setup.

  • GitHub - OpenNMT/OpenNMT-py: Open Source Neural Machine Translation and (Large) Language Models in PyTorch

  • Massive Exploration of Neural Machine Translation Architectures

    Neural Machine Translation (NMT) has shown remarkable progress over the past few years with production systems now being deployed to end-users. One major drawback of current architectures is that they are expensive to train, typically requiring days to weeks of GPU time to converge. This makes exhaustive hyperparameter search, as is commonly done with other neural network architectures, prohibitiv

  • Breaking Down the Language Barrier with Statistical Machine Translation

    Breaking Down the Language Barrier with Statistical Machine Translation Advanced Research Seminar I/III Graduate School of Information Science Nara Institute of Science and Technology January/February 2014 Instructor: Graham Neubig, Office hours: after class, or appointment by email (x@is.naist.jp where x=neubig) Course Description Machine translation is a technology to automatically translate fro

  • Machine Translation and Sequence to Sequence Models

    CS 11-731 Language Technologies Institute, School of Computer Science Carnegie Mellon University Tuesday/Thursday 1:30-2:50PM, GHC4102 Instructors/TAs: Instructor: Graham Neubig (gneubig@cs.cmu.edu) Office hours: Monday 4:00-5:00PM (GHC5409) TAs: Qinlan Shen and Dongyeop Kang (cs11-731-sp2017-tas@cs.cmu.edu) Office hours: Tuesday 3:00-4:00PM (Dongyeop@GHC5713), Wednesday 11:00-12:00AM (Qinlan@GHC6

  • Adversarial Neural Machine Translation

    In this paper, we study a new learning paradigm for Neural Machine Translation (NMT). Instead of maximizing the likelihood of the human translation as in previous works, we minimize the distinction between human translation and the translation given by an NMT model. To achieve this goal, inspired by the recent success of generative adversarial networks (GANs), we employ an adversarial training arc
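    Loosely, that objective can be sketched as a binary classifier (the adversary) learning to tell human references from model outputs while the translation model is pushed to make them indistinguishable. The lines below only show the loss structure with dummy pair encodings, not the paper's actual architecture.

      import torch
      import torch.nn as nn

      # Dummy vectors standing in for encodings of (source, translation) pairs.
      human_pair = torch.randn(8, 128)   # (source, human reference)
      model_pair = torch.randn(8, 128)   # (source, NMT output)

      adversary = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 1))
      bce = nn.BCEWithLogitsLoss()

      # Adversary: separate human pairs (label 1) from model pairs (label 0).
      d_loss = bce(adversary(human_pair), torch.ones(8, 1)) + \
               bce(adversary(model_pair), torch.zeros(8, 1))

      # Translator: push its outputs toward the "human" label, i.e. minimize
      # the distinction the adversary can draw.
      g_loss = bce(adversary(model_pair), torch.ones(8, 1))
      print(d_loss.item(), g_loss.item())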

  • Neural Machine Translation and Sequence-to-sequence Models

    Neural Machine Translation and Sequence-to-sequence Models: A Tutorial Graham Neubig Language Technologies Institute, Carnegie Mellon University 1 Introduction This tutorial introduces a new and powerful set of techniques variously called “neural machine translation” or “neural sequence-to-sequence models”. These techniques have been used in a number of tasks regarding the handling of human langu