Programming Machine Learning
From Coding to Deep Learning
by Paolo Perrotta
Published | 2020-03-25
---|---
Internal code | pplearn
Print status | In Print
Pages | 340
User level | Beginner
Keywords | Machine Learning, Deep Learning, Neural Networks, Keras, Python, Artificial Intelligence, AI, pattern matching
Related titles |
ISBN | 9781680506600
Other ISBN | Channel epub: 9781680507713, Channel PDF: 9781680507720, Kindle: 9781680507690, Safari: 9781680507706
BISACs | COM004000 COMPUTERS / Intelligence (AI) & Semantics; COM051300 COMPUTERS / Programming / Algorithms
Highlight
You’ve decided to tackle machine learning, whether you’re job hunting, embarking on a new project, or just think self-driving cars are cool. But where to start? It’s easy to be intimidated, even as a software developer. The good news is that it doesn’t have to be that hard. Conquer machine learning by writing code one line at a time, from simple learning programs all the way to a true deep learning system. Tackle the hard topics by breaking them down so they’re easier to understand, and build your confidence by getting your hands dirty.
Description
Peel away the obscurities of machine learning, starting from scratch and going all the way to deep learning. Machine learning can be intimidating, with its reliance on math and algorithms that most programmers don’t encounter in their regular work. Take a hands-on approach, writing the Python code yourself, without any libraries to obscure what’s really going on. Iterate on your design, and add layers of complexity as you go.
Build an image recognition application from scratch with supervised learning. Predict the future with linear regression. Dive into gradient descent, a fundamental algorithm that drives most of machine learning. Create perceptrons to classify data. Build neural networks to tackle more complex and sophisticated data sets. Train and refine those networks with backpropagation and batching. Layer the neural networks, eliminate overfitting, and add convolution to transform your neural network into a true deep learning system.
Start from the beginning and code your way to machine learning mastery.
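The kind of learner the description mentions can be sketched in a few lines of Python. This is not the book’s code, just a minimal illustration of gradient descent on a one-variable linear regression; the variable names and toy data below are made up for the example:

```python
import numpy as np

def predict(X, w, b):
    return X * w + b

def loss(X, Y, w, b):
    # Mean squared error between predictions and labels
    return np.average((predict(X, w, b) - Y) ** 2)

def train(X, Y, iterations=10_000, lr=0.001):
    # Gradient descent: nudge w and b downhill along the loss gradient
    w = b = 0.0
    for _ in range(iterations):
        error = predict(X, w, b) - Y
        w_gradient = 2 * np.average(error * X)
        b_gradient = 2 * np.average(error)
        w -= lr * w_gradient
        b -= lr * b_gradient
    return w, b

# Toy data that roughly follows y = 2x + 1
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([3.1, 4.9, 7.2, 9.0, 10.8])
w, b = train(X, Y)
```

The learning rate `lr` and the iteration count are hyperparameters you tune by hand, which is exactly the kind of experiment the “Hands On: Tweaking the Learning Rate” exercise invites.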
Contents and Extracts
- <b>How the Heck Is That Possible?</b>
- From Zero to Image Recognition
  - How Machine Learning Works
    - <b>Programming vs. Machine Learning</b>
    - Supervised Learning
    - The Math Behind the Magic
    - Setting Up Your System
  - Your First Learning Program
    - Getting to Know the Problem
    - Coding Linear Regression
    - Adding a Bias
    - What You Just Learned
    - Hands On: Tweaking the Learning Rate
  - Walking the Gradient
    - Our Algorithm Doesn’t Cut It
    - Gradient Descent
    - What You Just Learned
    - Hands On: Basecamp Overshooting
  - Hyperspace!
    - Adding More Dimensions
    - Matrix Math
    - Upgrading the Learner
    - Bye Bye, Bias
    - A Final Test Drive
    - What You Just Learned
    - Hands On: Field Statistician
  - A Discerning Machine
    - Where Linear Regression Fails
    - Invasion of the Sigmoids
    - Classification in Action
    - What You Just Learned
    - Hands On: Weighty Decisions
  - Getting Real <b>excerpt</b>
    - Data Come First
    - Our Own MNIST Library
    - The Real Thing
    - What You Just Learned
    - Hands On: Tricky Digits
  - The Final Challenge
    - Going Multiclass
    - Moment of Truth
    - What You Just Learned
    - Hands On: Minesweeper
- Neural Networks
  - The Perceptron
    - Enter the Perceptron
    - Assembling Perceptrons
    - Where Perceptrons Fail
    - A Tale of Perceptrons
  - Designing the Network
    - Assembling a Neural Network from Perceptrons
    - Enter the Softmax
    - Here’s the Plan
    - What You Just Learned
    - Hands On: Network Adventures
  - Building the Network
    - Coding Forward Propagation
    - Cross Entropy
    - What You Just Learned
    - Hands On: Time Travel Testing
  - Training the Network
    - The Case for Backpropagation
    - From the Chain Rule to Backpropagation
    - Applying Backpropagation
    - Initializing the Weights
    - The Finished Network
    - What You Just Learned
    - Hands On: Starting Off Wrong
  - How Classifiers Work
    - Tracing a Boundary
    - Bending the Boundary
    - What You Just Learned
    - Hands On: Data from Hell
  - Batchin’ Up
    - Learning, Visualized
    - Batch by Batch
    - Understanding Batches
    - What You Just Learned
    - Hands On: The Smallest Batch
  - The Zen of Testing
    - The Threat of Overfitting
    - A Testing Conundrum
    - What You Just Learned
    - Hands On: Thinking About Testing
  - Let’s Do Development
    - Preparing Data
    - Tuning Hyperparameters
    - The Final Test
    - Hands On: Achieving 99%
    - What You Just Learned… and the Road Ahead
- Deep Learning
  - A Deeper Kind of Network
    - The Echidna Dataset
    - Building a Neural Network with Keras
    - Making It Deep
    - What You Just Learned
    - Hands On: Keras Playground
  - Defeating Overfitting
    - Overfitting Explained
    - Regularizing the Model
    - A Regularization Toolbox
    - What You Just Learned
    - Hands On: Keeping It Simple
  - Taming Deep Networks
    - Understanding Activation Functions
    - Beyond the Sigmoid
    - Adding More Tricks to Your Bag
    - What You Just Learned
    - Hands On: The 10 Epochs Challenge
  - Beyond Vanilla Networks
    - The CIFAR-10 Dataset
    - The Building Blocks of CNNs
    - Running on Convolutions
    - What You Just Learned
    - Hands On: Hyperparameters Galore
  - Into the Deep
    - The Rise of Deep Learning
    - Unreasonable Effectiveness
    - Where Now?
    - Your Journey Begins
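To give a flavor of what the “Building the Network” chapter works toward, here is a minimal sketch (not the book’s code) of forward propagation through a one-hidden-layer network: a sigmoid hidden layer followed by a softmax output, with bias folded in as an extra weight column. All names, shapes, and inputs below are illustrative assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def softmax(logits):
    # Exponentiate, then normalize each row so it sums to 1
    exponentials = np.exp(logits)
    return exponentials / np.sum(exponentials, axis=1).reshape(-1, 1)

def prepend_bias(X):
    # Add a column of 1s so the bias becomes just another weight
    return np.insert(X, 0, 1, axis=1)

def forward(X, w1, w2):
    # Hidden layer: sigmoid; output layer: softmax over the classes
    h = sigmoid(np.matmul(prepend_bias(X), w1))
    return softmax(np.matmul(prepend_bias(h), w2))

# Tiny example: 2 inputs, 3 hidden nodes, 2 output classes
rng = np.random.default_rng(0)
w1 = rng.standard_normal((3, 3)) * 0.1  # (1 bias + 2 inputs) x 3 hidden
w2 = rng.standard_normal((4, 2)) * 0.1  # (1 bias + 3 hidden) x 2 classes
X = np.array([[0.5, -1.2]])
y_hat = forward(X, w1, w2)  # each row of y_hat sums to 1
```

Training such a network means finding `w1` and `w2` that minimize the cross-entropy loss, which is where backpropagation and batching come in.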