Description:
Machine learning is eating the software world. In Sebastian Raschka's bestselling Python Machine Learning, Second Edition, you will learn about the cutting edge of machine learning, neural networks, and deep learning. Written by Sebastian Raschka and Vahid Mirjalili, this edition has been updated and expanded to cover open-source technologies including scikit-learn, Keras, and TensorFlow, and it provides the practical knowledge and techniques needed to build effective machine learning and deep learning applications in Python. Before moving on to topics in data analysis, Sebastian Raschka and Vahid Mirjalili draw on their unique insight and expertise to introduce you to machine learning and deep learning algorithms. The book combines the theoretical principles of machine learning with a hands-on coding approach, giving you a thorough grasp of machine learning theory and its implementation in Python.
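
To give a flavor of the hands-on approach described above, here is a minimal, illustrative sketch (not taken from the book) of the kind of workflow named in the table of contents below, such as "First steps with scikit-learn - training a perceptron" and "Training a perceptron model on the Iris dataset"; the specific parameter values are assumptions chosen for demonstration only.

# Illustrative sketch, not from the book: training a scikit-learn
# Perceptron on the Iris dataset. Parameter values are assumed.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import Perceptron
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)

# Hold out 30% of the samples for evaluation.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1, stratify=y)

# Standardize features - gradient-based learners converge faster on scaled data.
scaler = StandardScaler().fit(X_train)
X_train_std = scaler.transform(X_train)
X_test_std = scaler.transform(X_test)

# Train the perceptron and evaluate it on the unseen test split.
ppn = Perceptron(eta0=0.1, random_state=1)
ppn.fit(X_train_std, y_train)
print('Test accuracy: %.3f' % accuracy_score(y_test, ppn.predict(X_test_std)))
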
Table of Contents:
Chapter 1: Giving Computers the Ability to Learn from Data
Building intelligent machines to transform data into knowledge
The three different types of machine learning
Making predictions about the future with supervised learning
Classification for predicting class labels
Regression for predicting continuous outcomes
Solving interactive problems with reinforcement learning
Discovering hidden structures with unsupervised learning
Finding subgroups with clustering
Dimensionality reduction for data compression
Introduction to the basic terminology and notations
A roadmap for building machine learning systems
Preprocessing - getting data into shape
Training and selecting a predictive model
Evaluating models and predicting unseen data instances
Using Python for machine learning
Installing Python and packages from the Python Package Index
Using the Anaconda Python distribution and package manager
Packages for scientific computing, data science, and machine learning
Summary
Chapter 2: Training Simple Machine Learning Algorithms for Classification
Artificial neurons - a brief glimpse into the early history of machine learning
The formal definition of an artificial neuron
The perceptron learning rule
Implementing a perceptron learning algorithm in Python
An object-oriented perceptron API
Training a perceptron model on the Iris dataset
Adaptive linear neurons and the convergence of learning
Minimizing cost functions with gradient descent
Implementing Adaline in Python
Improving gradient descent through feature scaling
Large-scale machine learning and stochastic gradient descent
Summary
Chapter 3: A Tour of Machine Learning Classifiers Using scikit-learn
Choosing a classification algorithm
First steps with scikit-learn - training a perceptron
Modeling class probabilities via logistic regression
Logistic regression intuition and conditional probabilities
Learning the weights of the logistic cost function
Converting an Adaline implementation into an algorithm for logistic regression
Training a logistic regression model with scikit-learn
Tackling overfitting via regularization
Maximum margin classification with support vector machines
Maximum margin intuition
Dealing with a nonlinearly separable case using slack variables