Deep Learning with JAX

Accelerate deep learning and other number-intensive tasks with JAX, Google’s awesome high-performance numerical computing library.

The JAX numerical computing library tackles the core performance challenges at the heart of deep learning and other scientific computing tasks. By combining Google’s Accelerated Linear Algebra platform (XLA) with a hyper-optimized version of NumPy and a variety of other high-performance features, JAX delivers a huge performance boost in low-level computations and transformations.

In Deep Learning with JAX you will learn how to:

• Use JAX for numerical calculations
• Build differentiable models with JAX primitives
• Run distributed and parallelized computations with JAX
• Use high-level neural network libraries such as Flax
• Leverage libraries and modules from the JAX ecosystem

Deep Learning with JAX is a hands-on guide to using JAX for deep learning and other mathematically intensive applications. Google Developer Expert Grigory Sapunov steadily builds your understanding of JAX’s concepts. The engaging examples introduce the fundamental concepts on which JAX relies and then show you how to apply them to real-world tasks. You’ll learn how to use JAX’s ecosystem of high-level libraries and modules, and also how to combine TensorFlow and PyTorch with JAX for data loading and deployment.
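
To give a flavor of what those skills look like in practice, here is a tiny illustrative sketch (not a listing from the book; the function and variable names are made up) combining NumPy-style arrays, automatic differentiation, and XLA compilation:

import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Squared error of a simple linear model, written in plain NumPy style
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

grad_loss = jax.grad(loss)      # differentiate the loss with respect to w
fast_grad = jax.jit(grad_loss)  # compile the gradient function with XLA

w = jnp.ones(3)
x = jnp.arange(6.0).reshape(2, 3)
y = jnp.array([1.0, 2.0])
print(fast_grad(w, x, y))       # gradient of the loss at w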

About the technology

Google’s JAX offers a fresh vision for deep learning. This powerful library gives you fine control over low-level processes like gradient calculations, delivering fast and efficient model training and inference, especially on large datasets. JAX has transformed how research scientists approach deep learning. Now boasting a robust ecosystem of tools and libraries, JAX makes evolutionary computations, federated learning, and other performance-sensitive tasks approachable for all types of applications.

About the book

Deep Learning with JAX teaches you to build effective neural networks with JAX. In this example-rich book, you’ll discover how JAX’s unique features help you tackle important deep learning performance challenges, like distributing computations across a cluster of TPUs. You’ll put the library into action as you create an image classification tool, an image filter application, and other realistic projects. The nicely annotated code listings demonstrate how JAX’s functional programming mindset improves composability and parallelization.
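
As a rough illustration of that composability point (a sketch of the general idea, not one of the book’s listings): because JAX transformations are ordinary functions, they stack cleanly, for example to compute per-example gradients over a batch and compile the result.

import jax
import jax.numpy as jnp

def predict(w, x):
    return jnp.tanh(jnp.dot(x, w))   # model output for a single example

# grad differentiates, vmap maps over the batch axis, jit compiles the whole thing
per_example_grads = jax.jit(jax.vmap(jax.grad(predict), in_axes=(None, 0)))

w = jnp.ones(4)
batch = jnp.ones((8, 4))
print(per_example_grads(w, batch).shape)   # (8, 4): one gradient per example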

What's inside

• Use JAX for numerical calculations
• Build differentiable models with JAX primitives
• Run distributed and parallelized computations with JAX
• Use high-level neural network libraries such as Flax

About the reader

For intermediate Python programmers who are familiar with deep learning.

About the author

Grigory Sapunov holds a Ph.D. in artificial intelligence and is a Google Developer Expert in Machine Learning.

The technical editor on this book was Nicholas McGreivy.

Table of Contents
Part 1
1 When and why to use JAX
2 Your first program in JAX
Part 2
3 Working with arrays
4 Calculating gradients
5 Compiling your code
6 Vectorizing your code

408 pages, Paperback

Published October 29, 2024

Community Reviews

5 stars: 6 (85%)
4 stars: 0 (0%)
3 stars: 1 (14%)
2 stars: 0 (0%)
1 star: 0 (0%)
October 30, 2024
I was lucky enough to review this book. I thought I was familiar with the important tools for deep learning, but I learned I was wrong: JAX is a very important library if you are doing large-scale numerical computing projects, not just machine learning.

What I found fascinating is that JAX isn't a new version of PyTorch or TensorFlow; it is designed to do one thing very well, and that is to accelerate the training.

So the author explains how to use data loaders from those other libraries, and then, when you are ready to train, you set up your pipeline; what was fascinating is that you can have JAX compile and parallelize your functions. This goes well beyond deep learning, as any scientific project that requires a large amount of number crunching can benefit.

The author gives references if you want to get much deeper into the math behind what is going on, but the explanations in the book itself are very helpful.

This is a tool that does require functional programming, but if you are going to be doing this type of work, I think pure functional programming is the best approach anyway.

I haven't yet tried to see how having it differentiate any function can be useful, but it has that capability, and when you are choosing how to parallelize, you get not only several different approaches but also more control over your multiple GPUs or TPUs before you kick off the processing.
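
A tiny toy example of what I mean (my own sketch, nothing from the book): you can list the devices JAX sees and then replicate a function across them with pmap.

import jax
import jax.numpy as jnp

print(jax.devices())                       # the GPUs/TPUs (or CPU) JAX can see

def step(x):
    return jnp.sin(x) ** 2                 # stand-in for real number crunching

n = jax.local_device_count()
data = jnp.arange(n * 4.0).reshape(n, 4)   # one shard per device along axis 0
out = jax.pmap(step)(data)                 # run step on every device in parallel
print(out.shape)                           # (n, 4)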

This is a heavy book, in that the math can be challenging for people who don't have a strong background in calculus. This is where my EE background helped, as I could follow what was being discussed. I did read some of the references; there are so many papers you can go to that it can feel daunting, but it also lets you pick how deep you want to go.

This book isn't for every project, but if you are working on a simulation, or, I expect, fine-tuning LLMs (which at the end of the day is just numerical processing), you should check out this book and quickly get up to speed on how JAX can help speed up your calculations.
January 9, 2025
I liked Grigory's concise writing style. In a matter of a few chapters, he clearly explains some truly complex concepts. This book is the quickest and most self-contained way to learn JAX, and I wish books written in the same style existed for PyTorch and TensorFlow as well.