Hands-On Deep Learning with Apache Spark: Build and deploy distributed deep learning applications on Apache Spark

Speed up the design and implementation of deep learning solutions using Apache Spark

Deep learning is a subset of machine learning that uses multi-layered neural networks to process datasets of considerable complexity. Hands-On Deep Learning with Apache Spark addresses both the technical and analytical complexity of deep learning and the speed at which solutions can be implemented on Apache Spark.

The book starts with the fundamentals of Apache Spark and deep learning. You will set up Spark for deep learning, learn the principles of distributed modeling, and understand different types of neural networks. You will then implement deep learning models, such as convolutional neural networks (CNNs), recurrent neural networks (RNNs), and long short-term memory (LSTM) networks, on Spark.

As you progress through the book, you will gain hands-on experience with the complex datasets you are dealing with. Along the way, you will use popular deep learning frameworks, such as TensorFlow, Deeplearning4j, and Keras, to train your distributed models.
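One common approach to distributed training on Spark (it is, for instance, the scheme behind Deeplearning4j's classic Spark integration) is synchronous parameter averaging: each worker trains a copy of the model on its own data partition, and the driver periodically averages the copies. Below is a minimal, dependency-free sketch of the idea; the one-parameter linear model and the function names are illustrative stand-ins for a real network and a real Spark job, not code from the book.

```python
def train_step(weights, shard, lr=0.05):
    """One local gradient step for a toy least-squares model y = w * x.

    Stands in for a worker refining its copy of the model on one
    data shard (one RDD partition, conceptually).
    """
    grad = sum(2 * (weights * x - y) * x for x, y in shard) / len(shard)
    return weights - lr * grad


def parameter_averaging(shards, weights=0.0, epochs=20):
    """Synchronous data-parallel training: every epoch each worker
    trains on its shard, then the driver averages the copies."""
    for _ in range(epochs):
        local = [train_step(weights, shard) for shard in shards]
        weights = sum(local) / len(local)  # driver-side average
    return weights


# Toy data for y = 3 * x, split across two "workers".
shards = [[(1, 3), (2, 6)], [(3, 9), (4, 12)]]
w = parameter_averaging(shards)  # w converges to ~3.0
```

In a real Spark job, the per-shard step would run inside a transformation over the partitions and the averaging on the driver; frameworks such as Deeplearning4j wrap this loop (and more refined variants like gradient sharing) behind a higher-level API.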

By the end of this book, you'll have gained experience in implementing your models on a variety of use cases.

ISBN-10: 1788994612
ISBN-13: 978-1788994613
Publisher: Packt Publishing
Price: 46.99
File Type: PDF
Pages: 322

About the Author

Guglielmo Iozzia is currently a Big Data Delivery Manager at Optum in Dublin (Ireland).

He completed his master's degree in Biomedical Engineering at the University of Bologna (Italy). For his final-year engineering project, he designed and implemented a diagnostic system to predict the behaviour of intracranial pressure in neurosurgery intensive care patients. The project was part of a larger collaboration between the DEIS (Department of Engineering, Information and Systems) of the University of Bologna and the Policlinico Hospital of Milan, and it was carried out using real patient data.

After graduating, he joined a newly founded IT company in Bologna that had implemented a new system for managing online payments. The company grew rapidly and expanded its business into different sectors (banking, manufacturing, public administration), so he had the chance to work on complex Java projects for different customers in different areas. Among these projects, GDPM deserves special mention: a predictive maintenance system for the machinery produced by the G.D group. Guglielmo took an active part in its initial design, first implementation, and release to production.

Six years later he moved to Rome where, after a short experience as a consultant at IFAD, he moved to RAI Net (part of the RAI Television group) before joining the IT department of FAO, an agency of the United Nations, where he stayed for more than 5 years.

In 2013 he had the chance to join IBM in Dublin. There he improved his DevOps skills, working mostly on cloud-based applications, and had the opportunity to move into Big Data and Machine Learning, applied first to Operations and then to Cybersecurity.

At the end of September 2016 he moved to Optum (part of the UnitedHealth Group), the healthcare IT company where he works at present. He and his teams are involved in Big Data and Analytics projects, mostly in the Payment Integrity space, in particular Fraud, Waste, and Abuse detection and prevention.

He is a golden member of DZone, where he writes articles, and maintains a personal blog to share his findings and thoughts on different tech topics (Java, Scala, Big Data, AI, DevOps, Open Source).

What You Will Learn

  • Understand the basics of deep learning
  • Set up Apache Spark for deep learning
  • Understand the principles of distributed modeling and different types of neural networks
  • Obtain an understanding of deep learning algorithms
  • Discover textual analysis and deep learning with Spark
  • Use popular deep learning frameworks, such as Deeplearning4j, TensorFlow, and Keras
  • Explore popular deep learning algorithms

Who This Book Is For

If you are a Scala developer, data scientist, or data analyst who wants to learn how to use Spark for implementing efficient deep learning models, Hands-On Deep Learning with Apache Spark is for you. Knowledge of the core machine learning concepts and some exposure to Spark will be helpful.

Table of Contents

  1. The Apache Spark Ecosystem
  2. Deep Learning Basics
  3. Extract, Transform, Load
  4. Streaming
  5. Convolutional Neural Networks
  6. Recurrent Neural Networks
  7. Training Neural Networks with Spark
  8. Monitoring and Debugging Neural Network Training
  9. Interpreting Neural Network Output
  10. Deploying on a Distributed System
  11. NLP Basics
  12. Textual Analysis and Deep Learning
  13. Convolution
  14. Image Classification
  15. What's Next for Deep Learning?
