Cover
Half Title
Series Page
Title Page
Copyright Page
Dedication
Table of Contents
1: No Time to Lose: Time Series Analysis
    1.1 Time Series
    1.2 One at a Time: Some Examples
    1.3 Bearing with Time: Pandas Series
        1.3.1 Pandas Time Series in Action
        1.3.2 Time Series Data Manipulation
    1.4 Modelling Time Series Data
        1.4.1 Regression... (Not) a Good Idea?
        1.4.2 Moving Averages and Exponential Smoothing
        1.4.3 Stationarity and Seasonality
        1.4.4 Determining Stationarity
        1.4.5 Autoregression to the Rescue
    1.5 Autoregressive Models
    1.6 Summary
2: Speaking Naturally: Text and Natural Language Processing
    2.1 Pages and Pages: Accessing Data from the Web
        2.1.1 Beautiful Soup in Action
    2.2 Make Mine a Regular: Regular Expressions
        2.2.1 Regular Expression Patterns
    2.3 Processing Text with Unicode
    2.4 Tokenising Text
    2.5 Word Tagging
    2.6 What Are You Talking About?: Topic Modelling
        2.6.1 Latent Dirichlet Allocation
        2.6.2 LDA in Action
    2.7 Summary
3: Getting Social: Graph Theory and Social Network Analysis
    3.1 Socialising Among Friends and Foes
    3.2 Let's Make a Connection: Graphs and Networks
        3.2.1 Taking the Measure: Degree, Centrality and More
        3.2.2 Connecting the Dots: Network Properties
    3.3 Social Networks with Python: NetworkX
        3.3.1 NetworkX: A Quick Intro
    3.4 Social Network Analysis in Action
        3.4.1 Karate Kids: Conflict and Fission in a Network
        3.4.2 In a Galaxy Far, Far Away: Central Characters in a Network
    3.5 Summary
4: Thinking Deeply: Neural Networks and Deep Learning
    4.1 A Trip Down Memory Lane
    4.2 No-Brainer: What Are Neural Networks?
        4.2.1 Neural Network Architecture: Layers and Nodes
        4.2.2 Firing Away: Neurons, Activate!
        4.2.3 Going Forwards and Backwards
    4.3 Neural Networks: From the Ground Up