Search results for "compression": 1 - 40 of 46

Because the tag search matched only a few results, title search results are shown instead.

There are 46 entries about compression. Related tags include github, image, and 画像. Popular entries include "Lyra: A New Very Low-Bitrate Codec for Speech Compression".
  • Lyra: A New Very Low-Bitrate Codec for Speech Compression

  • "Smush Image Compression and Optimization", a WordPress plugin that compresses and optimizes uploaded images without degrading them

    Smush Image Compression and Optimization is a WordPress plugin that can automatically compress and optimize uploaded images. It reduces file size without degrading the images. Installing Smush Image Compression and Optimization: the installation steps are as follows. Download Smush Image Compression and Optimization, extract the downloaded file and upload it to wp-content/plugins, then activate Smush on the [Plugins] page of the admin screen. Configuring Smush Image Compression and Optimization: once you activate Smush Image Compression and Optimization

  • Creating a Q&A bot that reads in a 200+ page web document using the "Contextual Compression Retriever" added in a langchain update - Qiita

    TL;DR: Embeddings were generated with openai embeddings for the user guide (209 pages in total) of the CFD library OpenFOAM. The generated embeddings were compressed with ContextualCompressionRetriever. The compressed retriever was passed to RetrievalQA to create a Q&A bot. The impression is that, as a result, it answers without breaking down. Web pages used: this time the user guide of the CFD library OpenFOAM was used. (https://doc.cfd.direct/openfo

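    A rough sketch of that pipeline (embed the guide, wrap the vector-store retriever in a compressor, hand it to a retrieval QA chain) in LangChain.js. The article itself uses Python; the class names match the API it describes, but the import paths, model settings, and the sample document and question below are assumptions for illustration.

      import { OpenAI } from "langchain/llms/openai";
      import { OpenAIEmbeddings } from "langchain/embeddings/openai";
      import { MemoryVectorStore } from "langchain/vectorstores/memory";
      import { ContextualCompressionRetriever } from "langchain/retrievers/contextual_compression";
      import { LLMChainExtractor } from "langchain/retrievers/document_compressors/chain_extract";
      import { RetrievalQAChain } from "langchain/chains";
      import { Document } from "langchain/document";

      // Stand-in for the loaded user-guide pages (the article loads the OpenFOAM guide).
      const docs = [new Document({ pageContent: "boundaryField entries set the boundary conditions for each patch." })];

      // 1. Embed the documents into an in-memory vector store.
      const vectorStore = await MemoryVectorStore.fromDocuments(docs, new OpenAIEmbeddings());

      // 2. Wrap the plain retriever in a contextual compressor: an LLM keeps only the
      //    parts of each retrieved document that are relevant to the query.
      const llm = new OpenAI({ temperature: 0 });
      const retriever = new ContextualCompressionRetriever({
        baseCompressor: LLMChainExtractor.fromLLM(llm),
        baseRetriever: vectorStore.asRetriever(),
      });

      // 3. Use the compressed retriever in a retrieval QA chain.
      const chain = RetrievalQAChain.fromLLM(llm, retriever);
      const res = await chain.call({ query: "How do I set boundary conditions?" });
      console.log(res.text);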
  • Optimizing content compression with Compression Dictionary Transport (Shared Brotli) | blog.jxck.io

    Intro: An Experiment with Compression Dictionary Transport is being run in Chrome. Intent to Experiment: Compression dictionary transport with Shared Brotli https://groups.google.com/a/chromium.org/g/blink-dev/c/NgH-BeYO72E This post explains the proposed specification and how it is applied to this site. brotli's Dictionary: compression schemes are basically built around the idea of "when the same value appears repeatedly, represent those occurrences together more compactly". # Representing repetition as a count: from: aaaabbbbb to: a4b5 The crux of this approach is how efficiently "the same values" can be found in the target data. For example, in the following example

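    Shared Brotli itself is not exposed in Node's zlib bindings, but the idea the post describes, namely seeding the compressor with an external dictionary that both sides already hold so repeated strings can be referenced instead of resent, can be illustrated with deflate's dictionary option. A sketch under that substitution; the JSON payloads are made up for illustration.

      import { deflateSync, inflateSync } from "node:zlib";

      // An "external dictionary": bytes both sides already share, e.g. a previous response.
      const dictionary = Buffer.from('{"status":"ok","items":[],"generated_at":"","cache":"max-age=60"}');

      // A new response that repeats most of the dictionary's structure.
      const payload = Buffer.from('{"status":"ok","items":[1,2,3],"generated_at":"2024-01-01","cache":"max-age=60"}');

      const plain = deflateSync(payload);
      const withDict = deflateSync(payload, { dictionary });
      console.log(plain.length, withDict.length); // the dictionary-aware output is smaller

      // The decompressor must supply the same dictionary to reconstruct the data.
      console.log(inflateSync(withDict, { dictionary }).equals(payload)); // true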
  • GitHub - jtesta/ssh-audit: SSH server & client security auditing (banner, key exchange, encryption, mac, compression, compatibility, security, etc)

    SSH1 and SSH2 protocol server support; analyze SSH client configuration; grab banner, recognize device or software and operating system, detect compression; gather key-exchange, host-key, encryption and message authentication code algorithms; output algorithm information (available since, removed/disabled, unsafe/weak/legacy, etc); output algorithm recommendations (append or remove based on recogn

  • How to Use AVIF: The New Next-Gen Image Compression Format — Lightspeed

    August 5, 2020. November 2, 2021 Update: Firefox 93 now supports the AVIF format without feature flag. August 26, 2020 Update: Chrome 85 now supports the AVIF format and the link to the preview build of the Squoosh.app has been updated as it now fully supports AVIF. A More Optimal Image Format: One of the upcoming technologies we're really ex

  • GitHub - caoscott/SReC: PyTorch Implementation of "Lossless Image Compression through Super-Resolution"

  • How to recognize the compression algorithms with your eyes - ZenHAX

  • GitHub - mhx/dwarfs: A fast high compression read-only file system for Linux, Windows and macOS

    DwarFS is a read-only file system with a focus on achieving very high compression ratios in particular for very redundant data. This probably doesn't sound very exciting, because if it's redundant, it should compress well. However, I found that other read-only, compressed file systems don't do a very good job at making use of this redundancy. See here for a comparison with other compressed file sy

  • GitHub - phoboslab/qoi: The “Quite OK Image Format” for fast, lossless image compression

  • GitHub - google/lyra: A Very Low-Bitrate Codec for Speech Compression

    The basic architecture of the Lyra codec is quite simple. Features are extracted from speech every 20ms and are then compressed for transmission at a desired bitrate between 3.2kbps and 9.2kbps. On the other end, a generative model uses those features to recreate the speech signal. Lyra harnesses the power of new natural-sounding generative models to maintain the low bitrate of parametric codecs w

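    As a quick sanity check on those numbers: with one feature frame every 20 ms, the bit budget per frame at the two ends of the quoted range works out as below (plain arithmetic, not code from the repository).

      // Bits available for one 20 ms Lyra feature frame at the quoted bitrates.
      const frameSeconds = 0.02;
      for (const kbps of [3.2, 9.2]) {
        console.log(`${kbps} kbps -> ${Math.round(kbps * 1000 * frameSeconds)} bits per frame`);
      }
      // 3.2 kbps -> 64 bits per frame, 9.2 kbps -> 184 bits per frame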
  • Pointer Compression in V8 · V8

    There is a constant battle between memory and performance. As users, we would like things to be fast as well as consume as little memory as possible. Unfortunately, usually improving performance comes at a cost of memory consumption (and vice versa). Back in 2014 Chrome switched from being a 32-bit process to a 64-bit process. This gave Chrome better security, stability and perform

  • Arch Linux - News: Now using Zstandard instead of xz for package compression

    As announced on the mailing list, on Friday, Dec 27 2019, our package compression scheme has changed from xz (.pkg.tar.xz) to zstd (.pkg.tar.zst). zstd and xz trade blows in their compression ratio. Recompressing all packages to zstd with our options yields a total ~0.8% increase in package size on all of our packages combined, but the decompression time for all packages saw a ~1300% speedup. We a

  • Sorted integers compression with Elias-Fano encoding

    January 6, 2018 compression elias-fano data structures In the previous post we discovered how to compress a set of integers by representing it as a bitmap and then compressing the latter using a succinct representation. This post instead is about compression of monotone non-decreasing integers lists by using Elias-Fano encoding. It may sound like a niche algorithm, something that solves such an in

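    To make the encoding concrete, here is a small JavaScript sketch of the standard Elias-Fano split: each value contributes its l low bits verbatim to a fixed-width array, and its high part is coded in unary in a shared bit vector. The bit vectors are kept as plain arrays of 0/1 for readability; a real implementation packs them into machine words and adds rank/select structures for random access.

      // Minimal Elias-Fano encode/decode for a monotone non-decreasing list of
      // non-negative integers (clarity over speed; values assumed to fit in 32 bits).
      function eliasFanoEncode(values) {
        const n = values.length;
        const u = values[n - 1] + 1;                          // universe size
        const l = Math.max(0, Math.floor(Math.log2(u / n)));  // width of the low part

        const low = [];    // low l bits of each value
        const high = [];   // unary-coded high parts as a 0/1 array
        for (let i = 0; i < n; i++) {
          low.push(values[i] & ((1 << l) - 1));
          const h = Math.floor(values[i] / 2 ** l);           // high part
          high[h + i] = 1;                                    // the i-th one lands at position h + i
        }
        for (let i = 0; i < high.length; i++) if (high[i] !== 1) high[i] = 0;
        return { n, l, low, high };
      }

      function eliasFanoDecode({ n, l, low, high }) {
        const out = [];
        let i = 0;                                            // index of the current value
        for (let pos = 0; pos < high.length && i < n; pos++) {
          if (high[pos] === 1) {
            out.push((pos - i) * 2 ** l + low[i]);            // invert pos = high + i
            i++;
          }
        }
        return out;
      }

      const data = [3, 4, 7, 13, 14, 15, 21, 43];
      const enc = eliasFanoEncode(data);
      console.log(enc.l, enc.high.join(""), enc.low);
      console.log(eliasFanoDecode(enc));                      // [3, 4, 7, 13, 14, 15, 21, 43]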
  • Pointer compression in Oilpan · V8

    It is absolutely idiotic to have 64-bit pointers when I compile a program that uses less than 4 gigabytes of RAM. When such pointer values appear inside a struct, they not only waste half the memory, they effectively throw away half of the cache. – Donald Knuth (2008) Truer words have (almost) never been spoken. We also see CPU vendors not actually shipping 64-bit CPUs and Android

  • Stable Diffusion Based Image Compression

    Intro: Stable Diffusion is currently inspiring the open source machine learning community as well as upsetting artists world wide. I was curious to see what else this impactful technology release could be used for other than making professional artists and designers ponder their job security. While experimenting with the model, I found that it makes for an extremely powerful lossy image compression

  • Reported Supply Chain Compromise Affecting XZ Utils Data Compression Library, CVE-2024-3094 | CISA

  • Trying out LangChain's new Contextual Compression Retriever | mah_lab / 西見 公宏

    When building a system like a Q&A chatbot, the pattern of embedding information related to the user's query into the prompt to return accurate answers is by now standard. https://blog.langchain.dev/improving-document-retrieval-with-contextual-compression/ On top of that, using a vector DB to retrieve the related information has become common, but the extracted passages are not necessarily an appropriate source of information for the question; even text rated as similar by a similarity score can come from a different context. The Contextual Compression Retriever added the other day (4/21) is meant to solve exactly this problem: it evaluates the information extracted from a vector DB or the like, and further

  • GitHub - phiresky/sqlite-zstd: Transparent dictionary-based row-level compression for SQLite

    The data will be moved to _table_name_zstd, while table_name will be a view that can be queried as normal, including SELECT, INSERT, UPDATE, and DELETE queries. This function will not compress any data by itself, you need to call zstd_incremental_maintenance afterwards. config is a json object describing the configuration. See TransparentCompressConfig for detail. The following differences apply

  • GitHub - 101arrowz/fflate: High performance (de)compression in an 8kB package

    fflate (short for fast flate) is the fastest, smallest, and most versatile pure JavaScript compression and decompression library in existence, handily beating pako, tiny-inflate, and UZIP.js in performance benchmarks while being multiple times more lightweight. Its compression ratios are often better than even the original Zlib C library. It includes support for DEFLATE, GZIP, and Zlib data. Data

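    A minimal usage sketch, assuming fflate's documented synchronous GZIP helpers (there are also streaming and asynchronous variants):

      import { gzipSync, gunzipSync, strToU8, strFromU8 } from "fflate";

      // Compress a UTF-8 string to GZIP bytes and decompress it back.
      const original = "hello hello hello compression";
      const compressed = gzipSync(strToU8(original), { level: 9 }); // Uint8Array
      const restored = strFromU8(gunzipSync(compressed));
      console.log(compressed.length, restored === original);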
  • GitHub - eashish93/imgsquash: Simple image compression full website code written in node, react and next.js framework. Easy to deploy as a microservice.

  • GitHub - TheRealOrange/icer_compression: Progressive, error tolerant, wavelet-based image compression algorithm

  • How to Improve Rails Caching with Brotli Compression

    Caching is an effective way to speed up the performance of Rails applications. However, the costs of an in-memory cache database could become significant for larger-scale projects. In this blog post, I’ll describe optimizing the Rails caching mechanism using the Brotli compression algorithm instead of the default Gzip. I’ll also discuss a more advanced technique of using in-memory cache for extrem

  • New for Firebase Hosting: request logging, Brotli compression, and internationalization

    We’re excited to announce several new features that make developing with Firebase Hosting even better! New features! Server-side analytics with Cloud Logging Our new integration with Cloud Logging gives you access to web request logs for your Hosting sites. Cloud Logging, previously known as Stackdriver Logging

  • About RFC8879 TLS Certificate Compression - ASnoKaze blog

    "RFC8879 TLS Certificate Compression" was published yesterday. It defines a mechanism for compressing the server certificate sent during the TLS handshake, which reduces the amount of handshake traffic. During the handshake there are few follow-up packets even when a packet is lost, so loss recovery is at a disadvantage; reducing the number of packets needed for the handshake therefore seems like a real benefit. QUIC (HTTP/3) also uses the TLS handshake, and because QUIC limits how much can be sent at once until Client Address Validation completes, reducing the traffic is a benefit in that sense as well. On this point, Patrick McManus of Fastly has written a blog post. How the number of packets actually sent

  • ASUS announces a 43-inch gaming display with Display Stream Compression support

  • Compressing images before upload with Browser Image Compression [Javascript] | バシャログ。

    Hi, this is fujihara. Today I will introduce BrowserImageCompression, which lets you compress images before uploading them. Installation: npm install browser-image-compression --save or yarn add browser-image-compression Code: the following is a simple snippet for trying it out (created with react-create-app). import './App.css'; import { useState } from 'react'; import imageCompression from 'browser-image-compression'; function App() { const [image_url, setImageUrl] = useState(''); const compressOption = {

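    For completeness, a small usage sketch with the library's documented options (the option values here are arbitrary examples):

      import imageCompression from "browser-image-compression";

      // Compress a File picked from an <input type="file"> before uploading it.
      async function compressBeforeUpload(file) {
        const options = {
          maxSizeMB: 1,            // target file size
          maxWidthOrHeight: 1920,  // shrink the longer edge to at most this
          useWebWorker: true,      // do the work off the main thread
        };
        const compressed = await imageCompression(file, options);
        console.log(`before: ${file.size} bytes, after: ${compressed.size} bytes`);
        return compressed;         // upload this with fetch/FormData as usual
      }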
  • How LangChain's Contextual Compression compresses context

    An abstraction called Contextual Compression has been added to LangChain; an overview can be found below. Contextual Compression focuses on the fact that "the text of the documents being indexed" and "the text included in the prompt as context" have different characteristics, and improves results by post-processing retrieved documents, transforming their content before it goes into the prompt. Prerequisites: it is enough to know the broad mechanism of "inserting text retrieved in advance into the prompt as context so the LLM can generate an answer to the question". The slides below, which I read recently, explain this well. Retriever used: for document retrieval, the Retriever built in "Debugging ChatGPT Retriever Plugins with LangChain" is used. "Recently trending English

  • GitHub - richardartoul/tsdb-layer: Time Series and FoundationDB. Millions of writes/s and 10x compression in under 2,000 lines of Go.

  • Data Compression for Large-Scale Streaming Experimentation

    Julie (Novak) Beckley, Andy Rhines, Jeffrey Wong, Matthew Wardrop, Toby Mao, Martin Tingley Ever wonder why Netflix works so well when you’re streaming at home, on the train, or in a foreign hotel? Behind the scenes, Netflix engineers are constantly striving to improve the quality of your streaming service. The goal is to bring you joy by delivering the content you love quickly and reliably every

  • GitHub - microsoft/LLMLingua: To speed up LLMs' inference and enhance LLM's perceive of key information, compress the prompt and KV-Cache, which achieves up to 20x compression with minimal performance loss.

  • Compression of PostgreSQL WAL Archives Becoming More Important

    As hardware and software evolve, the bottlenecks in a database system also shift. Many old problems might disappear and new types of problems pop-up. Old Limitations: There were days when CPU and Memory was a limitation. More than a decade back, servers with 4 cores were “High End” and as a DBA, my biggest worry was managing the available resources. And for an old DBA like me, Oracle’s attempt to po

  • Depth image compression by colorization for Intel® RealSense™ Depth Cameras

    Fig. 1 Left: Original depth image of D435. Center: Colorized depth image with JPG compression. Right: Recovered depth image from colorized and JPG compressed depth. Compression of RGB images is today a relatively mature field with many impressive codecs available to choose from

  • Compression and decompression in the browser with the Compression Streams API | Blog | Chrome for Developers

  • GitHub - Donaldcwl/browser-image-compression: Image compression in web browser

  • Compression Streams are now supported on all browsers | Blog | web.dev

    The Compression Streams API is for compressing and decompressing streams of data using the gzip or deflate (or deflate-raw) formats. Using the built-in compression of the Compression Streams API, JavaScript applications do not need to include a compression li

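    A minimal round trip with the built-in API, as described above (gzip here; "deflate" and "deflate-raw" work the same way):

      // Compress a string to gzip bytes and decompress it again, entirely in the browser.
      async function gzip(text) {
        const stream = new Blob([text]).stream().pipeThrough(new CompressionStream("gzip"));
        return new Uint8Array(await new Response(stream).arrayBuffer());
      }

      async function gunzip(bytes) {
        const stream = new Blob([bytes]).stream().pipeThrough(new DecompressionStream("gzip"));
        return new Response(stream).text();
      }

      const compressed = await gzip("hello hello hello");
      console.log(compressed.length, await gunzip(compressed)); // byte count, then the original text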
  • LLMZip: Lossless Text Compression using Large Language Models

    We provide new estimates of an asymptotic upper bound on the entropy of English using the large language model LLaMA-7B as a predictor for the next token given a window of past tokens. This estimate is significantly smaller than currently available estimates in \cite{cover1978convergent}, \cite{lutati2023focus}. A natural byproduct is an algorithm for lossless compression of English text which com

  • GitHub - WICG/compression-dictionary-transport

    This proposal adds support for using designated previous responses as an external dictionary for HTTP responses for compression schemes that support external dictionaries (e.g. Brotli and Zstandard). HTTP Content-Encoding is extended with new encoding types and support for allowing responses to be used as dictionaries for future requests. All actual header values and names still TBD: Server respon

  • A hyper-rough test of the binlog transaction compression introduced in MySQL 8.0.20 - それが僕には楽しかったんです。

    Introduction: Hi, this is kentsu. The company-wide new-hire training has been rather business-oriented and I have had little time to write code, but starting this week the engineering training forces me to wrestle with RFCs, which has brought me back to life. MySQL 8.0.20 was released, and what really caught my attention was binlog compression. I was in the middle of other tests, but it suddenly nagged at me, so I wanted to drop everything and look into it. A feature was added that losslessly compresses the binlog using the zstd algorithm, so I checked how much it actually compresses. I have never been involved in operating MySQL and follow it purely as a hobby, so binlo

