Hands-On Deep Learning with Apache Spark: Build and deploy distributed deep learning applications on Apache Spark
by Guglielmo Iozzia
Speed up the design and implementation of deep learning solutions using Apache Spark
- Explore the world of distributed deep learning with Apache Spark
- Train neural networks with deep learning libraries such as BigDL and TensorFlow
- Develop Spark deep learning applications to intelligently handle large and complex datasets
Deep learning is a subset of machine learning in which multi-layered neural networks are used to process complex datasets. Hands-On Deep Learning with Apache Spark addresses the technical and analytical complexity of deep learning, and the speed at which deep learning solutions can be implemented on Apache Spark.
The book starts with the fundamentals of Apache Spark and deep learning. You will set up Spark for deep learning, learn the principles of distributed modeling, and understand different types of neural networks. You will then implement deep learning models, such as convolutional neural networks (CNNs), recurrent neural networks (RNNs), and long short-term memory (LSTM) networks, on Spark.
As you progress through the book, you will gain hands-on experience of what it takes to understand the complex datasets you are dealing with. Along the way, you will use popular deep learning frameworks, such as TensorFlow, Deeplearning4j, and Keras, to train your distributed models.
By the end of this book, you'll have gained experience implementing your models across a variety of use cases.
What you will learn
- Understand the basics of deep learning
- Set up Apache Spark for deep learning
- Understand the principles of distributed modeling and different types of neural networks
- Gain an understanding of deep learning algorithms
- Discover textual analysis and deep learning with Spark
- Use popular deep learning frameworks, such as Deeplearning4j, TensorFlow, and Keras
- Explore popular deep learning algorithms
Who this book is for
If you are a Scala developer, data scientist, or data analyst who wants to learn how to use Spark for implementing efficient deep learning models, Hands-On Deep Learning with Apache Spark is for you. Knowledge of the core machine learning concepts and some exposure to Spark will be helpful.
Apache Spark 2.x for Java Developers: Explore big data at scale using Apache Spark 2.x Java APIs
by Sourav Gulati
- Perform big data processing with Spark, without having to learn Scala!
- Use the Spark Java API to implement efficient enterprise-grade applications for data processing and analytics
- Go beyond mainstream data processing by adding querying capability, Machine Learning, and graph processing using Spark
Apache Spark is the buzzword in the big data industry right now, especially with the increasing need for real-time streaming and data processing. While Spark is written in Scala, the Spark Java API exposes all of the Spark features available in the Scala version to Java developers. This book will show you how to implement various functionalities of the Apache Spark framework in Java, without stepping out of your comfort zone.
The book starts with an introduction to the Apache Spark 2.x ecosystem, followed by instructions on installing and configuring Spark, and a refresher on the Java concepts that will be useful when consuming Apache Spark's APIs. You will explore RDDs and their common Action and Transformation Java APIs, set up a production-like clustered environment, and work with Spark SQL. Moving on, you will perform near-real-time processing with Spark Streaming, machine learning analytics with Spark MLlib, and graph processing with GraphX, all using various Java packages.
By the end of the book, you will have a solid foundation in implementing components of the Spark framework in Java to build fast, real-time applications.
What you will learn
- Process data using different file formats such as XML, JSON, CSV, and plain and delimited text, using the Spark Core library
- Perform analytics on data from various data sources, such as Kafka and Flume, using the Spark Streaming library
- Learn SQL schema creation and the analysis of structured data using various SQL functions, including windowing functions, in the Spark SQL library
- Explore the Spark MLlib APIs while implementing machine learning techniques to solve real-world problems
- Get to know Spark GraphX and understand the various graph-based analytics that can be performed with Spark