
Search results for machine_translation: 1 - 24 of 24

  • Mozilla releases local machine translation tools as part of Project Bergamot | The Mozilla Blog

    In January of 2019, Mozilla joined the University of Edinburgh, Charles University, University of Sheffield and University of Tartu as part of a project funded by the European Union called Project Bergamot. The ultimate goal of this consortium was to build a set of neural machine translation tools that would enable Mozill…

  • LibreTranslate - Free and Open Source Machine Translation API

    Open Source Machine Translation API. Self-hosted, offline capable, easy to set up.
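A self-hosted LibreTranslate instance is driven through a small REST API; the sketch below builds a request for its documented /translate endpoint using only the standard library. The field names (q, source, target, format) follow the project's published API, but the host and port are assumptions — point API_URL at wherever your instance actually runs.

```python
import json
from urllib import request

# Assumed location of a local LibreTranslate instance (default port 5000).
API_URL = "http://localhost:5000/translate"

# Request body per the documented /translate API:
#   q = text to translate, source/target = ISO language codes.
payload = json.dumps({
    "q": "Hello, world!",
    "source": "en",
    "target": "es",
    "format": "text",
}).encode("utf-8")

req = request.Request(
    API_URL, data=payload, headers={"Content-Type": "application/json"}
)
# Uncomment once an instance is reachable; the response carries the
# translation under the "translatedText" key.
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["translatedText"])
print(req.get_method())  # POST, because data= is set
```

Instances that require an API key also accept an "api_key" field in the same JSON body.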
  • GitHub - LibreTranslate/LibreTranslate: Free and Open Source Machine Translation API. Self-hosted, offline capable and easy to setup.

  • Bergamot - a project to add and improve client-side machine translation in a web browser

    Machine translation done locally in your browser — no need to send your text out to the cloud. The Bergamot project implements free client-side translation software as a web extension for the open source Mozilla Firefox browser. Unlike cloud-based alternatives, translation is done locally using machine learning optimized for consumer hardware.

  • Deep Encoder, Shallow Decoder: Reevaluating Non-autoregressive Machine Translation

    Much recent effort has been invested in non-autoregressive neural machine translation, which appears to be an efficient alternative to state-of-the-art autoregressive machine translation on modern GPUs. In contrast to the latter, where generation is sequential, the former allows generation to be parallelized across target token positions. Some of the latest non-autoregressive models have achieved…
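The sequential-versus-parallel distinction in that abstract can be made concrete with a toy sketch — this is not a real NMT model, just a deterministic stand-in scorer, so the structural difference between the two decoding loops is visible:

```python
def score(src, prefix, position):
    # Stand-in for a model's per-position prediction; a fixed lookup
    # table keeps the example runnable without any ML dependencies.
    table = {0: "la", 1: "maison", 2: "bleue"}
    return table[position]

def autoregressive_decode(src, length):
    # Sequential: step t cannot start until step t-1 has produced its
    # token, because the prediction is conditioned on the prefix.
    out = []
    for t in range(length):
        out.append(score(src, tuple(out), t))
    return out

def non_autoregressive_decode(src, length):
    # Parallel: every target position is predicted independently of the
    # others, so all positions could be computed at once on a GPU.
    return [score(src, (), t) for t in range(length)]

print(autoregressive_decode("the blue house", 3))
print(non_autoregressive_decode("the blue house", 3))
```

With a real model the parallel variant trades some quality for this position-wise independence, which is exactly the trade-off the paper reexamines.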

  • [DL輪読会]GraphSeq2Seq: Graph-Sequence-to-Sequence for Neural Machine Translation

  • GitHub - mozilla/firefox-translations-training: Training pipelines for Firefox Translations neural machine translation models

  • GitHub - SpecializedGeneralist/translator: A simple self-hostable Machine Translation service, powered by spaGO

  • 200 languages within a single AI model: A breakthrough in high-quality machine translation

    Meta AI has built a single AI model, NLLB-200, that is the first to translate across 200 different languages with state-of-the-art quality that has been validated through extensive evaluations for each of them. We’ve also created a new evaluation dataset, FLORES-200, and measured NLLB-200’s performance in ea…

  • Exploring Massively Multilingual, Massive Neural Machine Translation

  • GitHub - UKPLab/EasyNMT: Easy to use, state-of-the-art Neural Machine Translation for 100+ languages

    This package provides easy-to-use, state-of-the-art machine translation for more than 100 languages. The highlights of this package are: easy installation and usage (state-of-the-art machine translation in 3 lines of code); automatic download of pre-trained machine translation models; translation between 150+ languages; automatic language detection for 170+ languages; sentence and document trans…

  • Unsupervised Neural Machine Translation with Generative Language Models Only

    We show how to derive state-of-the-art unsupervised neural machine translation systems from generatively pre-trained language models. Our method consists of three steps: few-shot amplification, distillation, and backtranslation. We first use the zero-shot translation ability of large pre-trained language models to generate translations for a small set of unlabeled sentences. We then amplify these…

  • Google’s Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation

    Abstract: We propose a simple solution to use a single Neural Machine Translation (NMT) model to translate between multiple languages. Our solution requires no changes to the model architecture from a standard NMT system but instead introduces an artificial token at the beginning of the input sentence to specify the required target language. Using a shared wordpiece vocabulary, our approach enables…
  • Amazon Translate ranked as #1 machine translation provider by Intento | Amazon Web Services

    AWS Machine Learning Blog. Customer obsession, one of the key Amazon Leadership Principles that guides everything we do at Amazon, has helped Amazon Translate be recognized as an industry-leading neural machine translation provider. This year, Intento ranked Amazon Translate #1 on the list of top-performing machine translation pr…

  • Lingvanex | Machine Translation and Speech Recognition

    Transform your business with AI-enhanced speech and translation tools. Translate text and documents with total security in 109 languages for a fixed price.

  • Visualizing A Neural Machine Translation Model (Mechanics of Seq2seq Models With Attention)

    Translations: Chinese (Simplified), French, Japanese, Korean, Persian, Russian, Turkish. Watch: MIT’s Deep Learning State of the Art lecture referencing this post. May 25th update: new graphics (RNN animation, word embedding graph), color coding, elaborated final attention example.

  • SNLP2019 Bridging the Gap between Training and Inference for Neural Machine Translation

    Slides on the ACL 2019 Best Long Paper, Bridging the Gap between Training and Inference for Neural Machine Translation, read at the 11th SNLP study group (最先端NLP勉強会). The paper proposes a remedy for exposure bias, a discrepancy that arises in text-generation tasks such as neural machine translation, document summarization, and caption generation.
  • No Language Left Behind: Scaling Human-Centered Machine Translation - Meta Research

    arXiv. Driven by the goal of eradicating language barriers on a global scale, machine translation has solidified itself as a key focus of artificial intelligence research today. However, such efforts have coalesced around a small subset of languages, leaving behind the vast majority of mostly low-resource languages. What does it tak…

  • GitHub - Helsinki-NLP/Opus-MT: Open neural machine translation models and web services

    Tools and resources for open translation services based on Marian-NMT, trained on OPUS data using OPUS-MT-train (new: leaderboard); mainly SentencePiece-based segmentation; mostly trained with guided alignment based on eflomal word alignments; pre-trained downloadable translation models (matrix view), CC-BY 4.0 license; more freely available translation models from the Tatoeba translation challenge…

  • From LLM to NMT: Advancing Low-Resource Machine Translation with Claude

    We show that Claude 3 Opus, a large language model (LLM) released by Anthropic in March 2024, exhibits stronger machine translation competence than other LLMs. Though we find evidence of data contamination with Claude on FLORES-200, we curate new benchmarks that corroborate the effectiveness of Claude for low-resource machine translation into English. We find that Claude has remarkable res…

  • Participatory Research for Low-resourced Machine Translation: A Case Study in African Languages

    Research in NLP lacks geographic diversity, and the question of how NLP can be scaled to low-resourced languages has not yet been adequately solved. "Low-resourced"-ness is a complex problem going beyond data availability and reflects systemic problems in society. In this paper, we focus on the task of Machine Translation (MT), which plays a crucial role in information accessibility and communicat…

  • Trados Machine Translation - Google Search

    Machine translation (MT) is the conversion of text from one language into another by a computer, with no human involvement. Pioneered in the 1950s, MT is one of the earliest examples of artificial intelligence.

  • Recent advances in low-resource machine translation

    Machine translation (MT) is one of the most successful applications of natural language processing (NLP) today, with systems surpassing human-level performance in some language translation tasks. These advances, however, rely on the availability of large-scale parallel corpora, or collections of sentences in both the source language and corresponding translations in the target language. Current hi…

  • SYSTRAN rated the most accurate engine in "The Intento 2021 State of Machine Translation Report"

    SYSTRAN Japan G.K. (headquartered in Minato-ku, Tokyo; Japan representative: Satoshi Egami; "SYSTRAN"), a pioneer in AI translation with more than 50 years of history, announced today that its AI translation engine was rated the most accurate in "The Intento 2021 State of Machine Translation Report," a translation-quality evaluation conducted jointly by Intento and TAUS. In this evaluation, SYSTRAN received the top rating not only for Japanese-English but in six language pairs, the most of any vendor. A key factor cited is that, compared with other vendors, it translated long sentences while maintaining high accuracy. In general, even for human translators, long…