
    Caffe2 Quick Start Guide. Modular and scalable deep learning made easy

    Publication language: English

    Pages: 136
    Available formats: PDF, ePub, Mobi

    Ebook: 79,90 zł


    Caffe2 is a popular deep learning library used for fast, scalable training and inference of deep learning models on different platforms. This book introduces you to the Caffe2 framework and demonstrates how you can leverage its power to build, train, and deploy efficient neural network models at scale.
    The Caffe2 Quick Start Guide will help you install Caffe2, compose networks using its operators, train models, and deploy models to different architectures. The book will also guide you on how to import models from Caffe and other frameworks using the ONNX interchange format. You will then learn about deep learning accelerators such as the CPU and GPU, and how to deploy Caffe2 models for inference on accelerators using inference engines. Finally, you'll understand how to deploy Caffe2 to a diverse set of hardware, using containers in the cloud and resource-constrained hardware such as the Raspberry Pi.
    By the end of this book, you will not only be able to compose and train popular neural network models with Caffe2, but also deploy them on accelerators, in the cloud, and on resource-constrained platforms such as mobile and embedded hardware.
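    The operator-composition workflow the blurb describes can be sketched as below. This is a minimal illustration, not material from the book: blob and net names (`data`, `fc_w`, `fc_b`, `toy_net`) are made up for the example, and it assumes a working Caffe2 installation (standalone, or a PyTorch build with Caffe2 enabled).

    ```python
    import numpy as np
    from caffe2.python import workspace, model_helper

    # Feed an input blob into the global Caffe2 workspace
    # (16 samples, 100 features each; names here are illustrative).
    data = np.random.rand(16, 100).astype(np.float32)
    workspace.FeedBlob("data", data)

    # Compose a tiny network from Caffe2 operators via ModelHelper.
    m = model_helper.ModelHelper(name="toy_net")
    m.param_init_net.XavierFill([], "fc_w", shape=[10, 100])  # FC weights: (out, in)
    m.param_init_net.ConstantFill([], "fc_b", shape=[10])     # FC bias
    fc1 = m.net.FC(["data", "fc_w", "fc_b"], "fc1")           # fully connected layer
    m.net.Sigmoid(fc1, "pred")                                # elementwise activation

    # Run the parameter-initialization net once, then a forward pass.
    workspace.RunNetOnce(m.param_init_net)
    workspace.RunNetOnce(m.net)
    out = workspace.FetchBlob("pred")  # NumPy array of shape (16, 10)
    ```

    Training would additionally add a loss operator and gradient operators (`AddGradientOperators`) to the same net, which is the pattern the book develops.
    
    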


    About the author

    Ashwin Nanjappa is a senior architect at NVIDIA, working in the TensorRT team on improving deep learning inference on GPU accelerators. He holds a PhD from the National University of Singapore, where he developed GPU algorithms for 3D Delaunay triangulation, a fundamental problem in computational geometry. As a postdoctoral research fellow at the Bioinformatics Institute (Singapore), he developed GPU-accelerated machine learning algorithms for pose estimation using depth cameras. As an algorithms research engineer at ViSenze (Singapore), he implemented computer vision algorithm pipelines in C++, developed a training framework built on Caffe in Python, and trained deep learning models for some of the world's most popular online shopping portals.
