AI and ML for Coders in PyTorch: A Coder’s Guide to Generative AI and Machine Learning

Author: Laurence Moroney


Eager to learn AI and machine learning but unsure where to start? Laurence Moroney’s hands-on, code-first guide demystifies complex AI concepts without relying on advanced mathematics. Designed for programmers, it focuses on practical applications using PyTorch, helping you build real-world models without feeling overwhelmed. From computer vision and natural language processing (NLP) to generative AI with Hugging Face Transformers, this book equips you with the skills most in demand for AI development today. You’ll also learn how to confidently deploy your models across the web and cloud.

• Gain the confidence to apply AI without needing advanced math or theory expertise
• Discover how to build AI models for computer vision, NLP, and sequence modeling with PyTorch
• Learn generative AI techniques with Hugging Face Diffusers and Transformers


📄 Text Preview (First 20 pages)


📄 Page 1
Laurence Moroney
Foreword by Andrew Ng

AI and ML for Coders in PyTorch
A Coder’s Guide to Generative AI and Machine Learning
📄 Page 2
ISBN: 978-1-098-19917-3
US $79.99 | CAN $99.99
DATA

Eager to learn AI and machine learning but unsure where to start? Laurence Moroney’s hands-on, code-first guide demystifies complex AI concepts without relying on advanced mathematics. Designed for programmers, it focuses on practical applications using PyTorch, helping you build real-world models without feeling overwhelmed. From computer vision and natural language processing (NLP) to generative AI with Hugging Face Transformers, this book equips you with the skills most in demand for AI development today. You’ll also learn how to confidently deploy your models across the web and cloud.

• Gain the confidence to apply AI without needing advanced math or theory expertise
• Discover how to train AI models for computer vision, NLP, and more
• Learn generative AI techniques with Transformers and Diffusers
• Learn how to prompt-tune, fine-tune, and create LoRA for your generative models

“A perfect hands-on guide for developers who want to actually build with AI, not just read about it. This is the book I wish I had when I started.”
—Dominic Monn, CEO, MentorCruise

“This is a book you won’t want to miss if you want to become a full stack AI practitioner.”
—Dr. Pin-Yu Chen, principal research scientist, IBM Research

“Laurence Moroney has been a major force in helping developers succeed with AI in TensorFlow and PyTorch. I have been privileged to work together with him in teaching several specializations with DeepLearning.AI and Coursera, including an upcoming one on PyTorch. With Laurence as a teacher, great adventures await you.”
—Andrew Ng, founder, DeepLearning.AI

Laurence Moroney is an award-winning AI researcher and best-selling author of over 20 books. A 30+-year veteran of the software and ML industries, he’s passionate about building the next generation of engineers.
📄 Page 3
Praise for AI and ML for Coders in PyTorch

A perfect hands-on guide for developers who want to actually build with AI, not just read about it. Clear, practical, and grounded in PyTorch—this is the book I wish I had when I started.
—Dominic Monn, CEO, MentorCruise.com

This is a book you won’t want to miss if you want to become a full-stack AI practitioner. It provides a comprehensive overview and concrete examples for building a variety of AI models and applications from scratch.
—Dr. Pin-Yu Chen, principal research scientist, IBM Research

A must-read book for developers diving into AI/ML. You will learn generative AI through real-world coding examples in PyTorch.
—Margaret Maynard-Reid, ML engineer at M Couture 3D

Laurence has masterfully bridged the gap between theory and practice—AI and ML for Coders in PyTorch is not just a book, it’s a hands-on journey through modern machine learning, from basics to large language models, all with clarity and purpose.
—Vishwesh Shrimali, AI engineer in the automotive industry

Laurence has done it again, distilling complex AI concepts into an approachable, coder-first masterclass. This PyTorch edition makes machine learning accessible to a broader and powerful community.
—Laura Uzcátegui, cofounder of DynG AI, Inc.
📄 Page 4
This book is a fantastic piece for any developer or CS student looking to step into AI. Laurence pairs well-explained PyTorch code with clear foundational theory and visuals that truly boost comprehension. Covering vision, time-series, NLP, and more, it’s a practical, comprehensive guide that builds solid machine learning foundations.
—Louis-François Bouchard, AI educator; cofounder & CTO, Towards AI

Laurence’s trademark code-before-theory approach lowers the barrier for busy coders. For the millions who loved his bestselling AI and Machine Learning for Coders, this PyTorch sequel is your fast-track to Generative AI. Spanning LLM fine-tuning, RAG, and Stable Diffusion, this book is an essential upgrade for today’s AI wave.
—Ammar Mohanna, AI consultant & lecturer

Laurence has done a phenomenal job crafting a book that brings developers up to speed in the ever-evolving world of generative AI. A fantastic teacher, he breaks down complex concepts with clarity, making them accessible to learners at any level. The practical code examples throughout the book provide a solid foundation to spark creativity and empower developers to start building right away.
—Roya Kandalan, PhD, Generative AI research scientist
📄 Page 5
Laurence Moroney

AI and ML for Coders in PyTorch
A Coder’s Guide to Generative AI and Machine Learning
📄 Page 6
AI and ML for Coders in PyTorch
by Laurence Moroney

Copyright © 2025 Laurence Moroney. All rights reserved.
Printed in the United States of America.
Published by O’Reilly Media, Inc., 1005 Gravenstein Highway North, Sebastopol, CA 95472.
ISBN: 978-1-098-19917-3 [LSI]

O’Reilly books may be purchased for educational, business, or sales promotional use. Online editions are also available for most titles (http://oreilly.com). For more information, contact our corporate/institutional sales department: 800-998-9938 or corporate@oreilly.com.

Acquisitions Editor: Nicole Butterfield
Development Editor: Jill Leonard
Production Editor: Aleeya Rahman
Copyeditor: Doug McNair
Proofreader: Piper Content Partners
Indexer: Sue Klefstad
Cover Designer: Susan Brown
Cover Illustrator: Monica Kamsvaag
Interior Designer: David Futato
Interior Illustrator: Kate Dullea

July 2025: First Edition

Revision History for the First Edition
2025-06-27: First Release

See http://oreilly.com/catalog/errata.csp?isbn=9781098199173 for release details.

The O’Reilly logo is a registered trademark of O’Reilly Media, Inc. AI and ML for Coders in PyTorch, the cover image, and related trade dress are trademarks of O’Reilly Media, Inc.

The views expressed in this work are those of the author and do not represent the publisher’s views. While the publisher and the author have used good faith efforts to ensure that the information and instructions contained in this work are accurate, the publisher and the author disclaim all responsibility for errors or omissions, including without limitation responsibility for damages resulting from the use of or reliance on this work. Use of the information and instructions contained in this work is at your own risk. If any code samples or other technology this work contains or describes is subject to open source licenses or the intellectual property rights of others, it is your responsibility to ensure that your use thereof complies with such licenses and/or rights.
📄 Page 7
Table of Contents

Foreword  xiii
Preface  xv

1. Introduction to PyTorch  1
   What Is Machine Learning?  1
   Limitations of Traditional Programming  3
   From Programming to Learning  5
   What Is PyTorch?  7
   Using PyTorch  9
   Installing PyTorch in Python  9
   Using PyTorch in PyCharm  10
   Using PyTorch in Google Colab  12
   Getting Started with Machine Learning  14
   Seeing What the Network Learned  21
   Summary  21

2. Introduction to Computer Vision  23
   How Computer Vision Works  23
   The Fashion MNIST Database  24
   Neurons for Vision  26
   Designing the Neural Network  28
   The Complete Code  30
   Training the Neural Network  35
   Exploring the Model Output  38
   Overfitting  40
   Early Stopping  41
   Summary  43
📄 Page 8
3. Going Beyond the Basics: Detecting Features in Images  45
   Convolutions  46
   Pooling  48
   Implementing Convolutional Neural Networks  49
   Exploring the Convolutional Network  52
   Building a CNN to Distinguish Between Horses and Humans  55
   The “Horses or Humans” Dataset  55
   Handling the Data  56
   CNN Architecture for “Horses or Humans”  58
   Adding Validation to the “Horses or Humans” Dataset  60
   Testing “Horses or Humans” Images  63
   Image Augmentation  65
   Transfer Learning  69
   Multiclass Classification  74
   Dropout Regularization  77
   Summary  80

4. Using Data with PyTorch  81
   Getting Started with Datasets  82
   Exploring the FashionMNIST Class  84
   Generic Dataset Classes  84
   ImageFolder  85
   DatasetFolder  85
   FakeData  86
   Using Custom Splits  86
   The ETL Process for Managing Data in Machine Learning  88
   Optimizing the Load Phase  89
   Using the DataLoader Class  91
   Batching  91
   Shuffling  91
   Parallel Data Loading  91
   Custom Data Sampling  92
   Parallelizing ETL to Improve Training Performance  92
   Summary  93

5. Introduction to Natural Language Processing  95
   Encoding Language into Numbers  95
   Getting Started with Tokenization  96
   Turning Sentences into Sequences  100
   Removing Stopwords and Cleaning Text  103
   Stripping Out HTML Tags  103
   Stripping Out Stopwords  103
📄 Page 9
   Stripping Out Punctuation  104
   Working with Real Data Sources  104
   Getting Text Datasets  105
   Getting Text from CSV Files  109
   Getting Text from JSON Files  112
   Summary  114

6. Making Sentiment Programmable by Using Embeddings  115
   Establishing Meaning from Words  115
   A Simple Example: Positives and Negatives  116
   Going a Little Deeper: Vectors  117
   Embeddings in PyTorch  118
   Building a Sarcasm Detector by Using Embeddings  118
   Reducing Overfitting in Language Models  121
   Putting It All Together  137
   Using the Model to Classify a Sentence  139
   Visualizing the Embeddings  141
   Using Pretrained Embeddings  143
   Summary  146

7. Recurrent Neural Networks for Natural Language Processing  147
   The Basis of Recurrence  147
   Extending Recurrence for Language  150
   Creating a Text Classifier with RNNs  153
   Stacking LSTMs  156
   Using Pretrained Embeddings with RNNs  166
   Summary  172

8. Using ML to Create Text  173
   Turning Sequences into Input Sequences  174
   Creating the Model  180
   Generating Text  182
   Predicting the Next Word  182
   Compounding Predictions to Generate Text  184
   Extending the Dataset  187
   Improving the Model Architecture  188
   Embedding Dimensions  189
   Initializing the LSTMs  189
   Variable Learning Rate  190
   Improving the Data  191
   Character-Based Encoding  193
   Summary  195
📄 Page 10
9. Understanding Sequence and Time Series Data  197
   Common Attributes of Time Series  199
   Trend  199
   Seasonality  199
   Autocorrelation  200
   Noise  201
   Techniques for Predicting Time Series  202
   Naive Prediction to Create a Baseline  202
   Measuring Prediction Accuracy  204
   Less Naive Predictions: Using a Moving Average for Prediction  205
   Improving the Moving-Average Analysis  206
   Summary  207

10. Creating ML Models to Predict Sequences  209
   Creating a Windowed Dataset  211
   Creating a Windowed Version of the Time Series Dataset  214
   Creating and Training a DNN to Fit the Sequence Data  217
   Evaluating the Results of the DNN  219
   Tuning the Learning Rate  222
   Summary  222

11. Using Convolutional and Recurrent Methods for Sequence Models  223
   Convolutions for Sequence Data  223
   Coding Convolutions  224
   Experimenting with the Conv1D Hyperparameters  229
   Using NASA Weather Data  232
   Reading GISS Data in Python  234
   Using RNNs for Sequence Modeling  236
   Exploring a Larger Dataset  239
   Using Other Recurrent Methods  242
   Using Dropout  243
   Using Bidirectional RNNs  246
   Summary  247

12. Concepts of Inference  249
   Tensors  249
   Image Data  250
   Text Data  253
   Tensors Out of a Model  254
   Summary  256
📄 Page 11
13. Hosting PyTorch Models for Serving  257
   Introducing TorchServe  258
   Setting Up TorchServe  260
   Preparing Your Environment  260
   Setting Up Your config.properties File  260
   Defining Your Model  261
   Creating the Handler File  262
   Creating the Model Archive  264
   Starting the Server  266
   Testing Inference  267
   Going Further  269
   Serving with Flask  270
   Creating an Environment for Flask  270
   Creating a Flask Server in Python  270
   Summary  272

14. Using Third-Party Models and Hubs  273
   The Hugging Face Hub  274
   Using Hugging Face Hub  275
   Using a Model from Hugging Face Hub  280
   PyTorch Hub  282
   Using PyTorch Vision Models  283
   Natural Language Processing  285
   Other Models  286
   Summary  286

15. Transformers and transformers  289
   Understanding Transformers  290
   Encoder Architectures  290
   The Decoder Architecture  297
   The Encoder-Decoder Architecture  303
   The transformers API  306
   Getting Started with transformers  307
   Core Concepts  308
   Pipelines  308
   Tokenizers  310
   Summary  314

16. Using LLMs with Custom Data  317
   Fine-Tuning an LLM  317
   Setup and Dependencies  318
   Loading and Examining the Data  319
📄 Page 12
   Initializing the Model and Tokenizer  319
   Preprocessing the Data  320
   Collating the Data  320
   Defining Metrics  321
   Configuring Training  321
   Initializing the Trainer  322
   Training and Evaluation  323
   Saving and Testing the Model  324
   Prompt-Tuning an LLM  325
   Preparing the Data  326
   Creating the Data Loaders  327
   Defining the Model  327
   Training the Model  331
   Evaluation During Training  332
   Reporting Training Metrics  333
   Saving the Prompt Embeddings  334
   Performing Inference with the Model  334
   Summary  337

17. Serving LLMs with Ollama  339
   Getting Started with Ollama  340
   Running Ollama as a Server  343
   Building an App That Uses an Ollama LLM  345
   The Scenario  346
   Building a Python Proof-of-Concept  347
   Creating a Web App for Ollama  349
   The app.js File  351
   The index.html File  353
   Summary  354

18. Introduction to RAG  357
   What Is RAG?  359
   Getting Started with RAG  360
   Understanding Similarity  361
   Creating the Database  362
   Performing a Similarity Search  365
   Putting It All Together  365
   Using RAG Content with an LLM  366
   Extending to Hosted Models  370
   Summary  371
📄 Page 13
19. Using Generative Models with Hugging Face Diffusers  373
   What Are Diffusion Models?  373
   Using Hugging Face Diffusers  376
   Image-to-Image with Diffusers  379
   Inpainting with Diffusers  382
   Summary  385

20. Tuning Generative Image Models with LoRA and Diffusers  387
   Training a LoRA with Diffusers  388
   Getting Diffusers  388
   Getting Data for Fine-Tuning a LoRA  389
   Fine-Tuning a Model with Diffusers  392
   Publishing Your Model  395
   Generating an Image with the Custom LoRA  397
   Summary  401

Index  403
📄 Page 14
(This page has no text content)
📄 Page 15
Foreword

Dear Reader,

AI is poised to transform every industry, but almost every AI application needs to be customized for its particular use. A system for reading medical records is different from one for finding defects in a factory, which is different from a product recommendation engine. For AI to reach its full potential, engineers need tools that can help them adapt the amazing capabilities available to the millions of concrete problems we wish to solve.

When I led the Google Brain team, we started to build a C++ framework for deep learning called DistBelief. We were excited about the potential of harnessing thousands of CPUs to train a neural network (for instance, using 16,000 CPUs to train a cat detector on unlabeled YouTube videos). How far deep learning has come since then! What was once cutting-edge can now be done on your laptop, and you’ll learn how to do it, with PyTorch, in this book.

Frameworks for creating ML models to implement AI have come a long way, too. PyTorch has made significant progress. It is designed to be easy to learn as well as powerful enough to be used by researchers. Its rich features can help one build a wide range of AI tools, from simple models, to transfer learning from others, to fine-tuning the most modern generative AI models. Today, millions of coders have become AI developers thanks to frameworks like PyTorch.

Laurence Moroney has been a major force in helping developers succeed with AI in TensorFlow and PyTorch. I have been privileged to work together with him in teaching several specializations with deeplearning.ai and Coursera, including an upcoming one on PyTorch.
📄 Page 16
In our early days of working together, Laurence once Slacked me:

   Andrew sang a sad old song fainted through miss milliner invitation hoops fainted fainted [...]

He had trained an LSTM on lyrics of traditional Irish songs, and it generated these lines. But in this post-Transformer network age, we can create far more sophisticated content—from poetry, to code, to analysis of documents! In this book, he covers both and helps you be prepared for modern AI.

And, if AI opens the door to fun like that, how could anyone not want to get involved? You can 1) work on exciting projects that move humanity forward, 2) advance your career, and 3) get free Irish poetry.

I wish you the best in your journey learning PyTorch. With Laurence as a teacher, great adventures await you. Keep learning!

— Andrew Ng
Founder, DeepLearning.AI
📄 Page 17
Preface

Welcome to AI and ML for Coders in PyTorch. My machine learning (ML) journey began many years ago with languages and frameworks like Lisp and Prolog. After that, my journey took me to Google, where I helped launch and grow TensorFlow. This experience informed my previous book, AI and Machine Learning for Coders.

Since that book was published, whenever I met with the community to talk about AI, one question would come up: whether or not the questioner should invest their time in PyTorch. It was a strange question at first, but the more I heard it, the more I began to investigate.

That line of thought got me to this point in my career, where PyTorch, once a rival to my work, is now something I passionately embrace. Why? Because it strikes a perfect balance between having the power and flexibility to let researchers or advanced engineers push the limits and also having the simplicity for any developer to pick it up and start their journey into ML.

The goal of this book is to prepare you, as a coder, for just that—it’s accessible enough if you don’t fully understand ML yet, and it also exposes you to the advanced concepts that will help you go deeper. The aim: to equip you to be an ML and AI developer without needing a PhD!

I hope that you’ll find this book useful and that it will empower you with the confidence to get started on this wonderful and rewarding journey.

Who Should Read This Book

If you’re interested in AI and ML, and you want to get up and running quickly with building models that learn from data, this book is for you. If you’re interested in getting started with common AI and ML concepts—computer vision, natural language processing, sequence modeling, and more—and want to see how neural networks can be trained to solve problems in these spaces, I think you’ll enjoy this book. And if
📄 Page 18
you’ve heard all of the hoopla around generative AI, we roll our sleeves up and explore how that works with transformer- and diffuser-based models.

Most of all, if you’ve put off entering this valuable area of computer science because of perceived difficulty, in particular believing that you’ll need to dust off your old calculus books, then fear not: this book takes a code-first approach that shows you just how easy it is to get started in the world of ML and artificial intelligence using PyTorch.

Why I Wrote This Book

I first got seriously involved with artificial intelligence in the spring of 1992. A freshly minted physics graduate living in London in the midst of a terrible recession, I had been unemployed for six months. The British government started a program to train 20 people in AI technology and put out a call for applicants. I was the first participant selected.

Three months later, the program failed miserably, because while there was plenty of theoretical work that could be done with AI, there was no easy way to do it practically. One could write simple inference in a language called Prolog and perform list processing in a language called Lisp, but there was no clear path to deploying them in industry. The famous “AI winter” followed.

Then, in 2016, while I was working at Google on a product called Firebase, the company offered ML training to all engineers. I sat in a room with a number of other people and listened to lectures about calculus and gradient descent. I couldn’t quite match this to a practical implementation of ML, and I was suddenly transported back to 1992. I gave feedback about this, and about how we should be educating people in ML—teaching the code first to coders. Google embraced this philosophy, as did Meta with the release of PyTorch.
In particular, both emphasized high-level APIs that made it easy for developers to get started, and I realized there was a need for a book that took advantage of this and widened access to ML so that it wasn’t just for mathematicians or PhDs anymore. I believe that more people using this technology and deploying it to end users will lead to an explosion in AI and ML that will prevent another AI winter and change the world very much for the better. I’m already seeing the impact of this, from the work done by Google on diabetic retinopathy, through Penn State University, to PlantVillage building an ML model for mobile that helps farmers diagnose cassava disease, to Médecins Sans Frontières using TensorFlow models to help diagnose antibiotic resistance, and much, much more!

With the advent of generative AI, and the emergence of transformers and diffusers as libraries in their own right, the next great wave of AI is upon us. PyTorch is at the heart of all of that—so it was time for me to bring my work up-to-date and show just how easy it is for you to dip your toes in the waters of AI and ML development.
📄 Page 19
With that in mind, welcome to this book on AI and ML for coders in PyTorch. I can’t wait to see what you build.

Navigating This Book

The book is written in two main parts. Part I (Chapters 1–11) talks about how to use PyTorch to build ML models for a variety of scenarios. It takes you from first principles—building a model with a neural network containing only one neuron—through computer vision, natural language processing, and sequence modeling. Part II (Chapters 12–20) then walks you through generative AI scenarios—from understanding how transformers work in applications like ChatGPT through diffusers for image generation like Midjourney. Most chapters are standalone, so you can drop in and learn something new, or, of course, you can just read the book cover to cover.

Technology You Need to Understand

The goal of the first half of the book is to help you learn how to use PyTorch to build models with a variety of architectures. The only real prerequisite to this is understanding Python, and in particular Python notation for data and array processing. You might also want to explore NumPy, a Python library for numeric calculations. If you have no familiarity with these, they are quite easy to learn, and you can probably pick up what you need as you go along (although some of the array notation might be a bit hard to grasp).

Online Resources

A variety of online resources are used by, and supported in, this book. At the very least, I would recommend that you keep an eye on O’Reilly’s website for books that complement this one and for any updates and breaking changes to technologies discussed in the book. The code for this book is available on the book’s GitHub page, and I will keep it up to date there as the platform evolves.

Conventions Used in This Book

The following typographical conventions are used in this book:

Italic
   Indicates new terms, URLs, email addresses, filenames, and file extensions.
Constant width
   Used for program listings, as well as within paragraphs to refer to program elements such as variable or function names, databases, data types, environment variables, statements, and keywords.
📄 Page 20
Constant width bold
   Shows commands or other text that should be typed literally by the user.

Constant width italic
   Shows text that should be replaced with user-supplied values or by values determined by context.

This element signifies a tip or suggestion.

This element signifies a general note.

Using Code Examples

Supplemental material (code examples, exercises, etc.) is available for download at https://github.com/lmoroney/PyTorch-Book-FIles. If you have a technical question or a problem using the code examples, please send email to support@oreilly.com.

This book is here to help you get your job done. In general, if example code is offered with this book, you may use it in your programs and documentation. You do not need to contact us for permission unless you’re reproducing a significant portion of the code. For example, writing a program that uses several chunks of code from this book does not require permission. Selling or distributing examples from O’Reilly books does require permission. Answering a question by citing this book and quoting example code does not require permission. Incorporating a significant amount of example code from this book into your product’s documentation does require permission.

We appreciate, but generally do not require, attribution. An attribution usually includes the title, author, publisher, and ISBN. For example: “AI and ML for Coders with PyTorch, by Laurence Moroney. Copyright 2025 Laurence Moroney, 978-1-098-19917-3.”

If you feel your use of code examples falls outside fair use or the permission given above, feel free to contact us at permissions@oreilly.com.
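As a rough gauge of the “Python notation for data and array processing” prerequisite the preface mentions, the following short sketch (not taken from the book) shows the level of NumPy slicing and broadcasting a reader is expected to be able to follow:

```python
import numpy as np

# A 3x4 array of sequential values: rows 0..2, columns 0..3
a = np.arange(12).reshape(3, 4)

# Slicing: second row, columns 1 and 2 -> array([5, 6])
row_slice = a[1, 1:3]

# Broadcasting: subtract each column's mean from every row;
# the (4,) mean vector stretches across all 3 rows automatically
centered = a - a.mean(axis=0)

# Boolean masking: select only the values greater than 6
big = a[a > 6]

print(row_slice, centered[0], big)
```

If expressions like these read naturally, you have all the array background the first half of the book assumes.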
