
Apache Spark 2.0 with Java - Learn Spark from a Big Data Guru


Tao W., James Lee, Level Up, Jiarui Zhou

3:24:10

  • 1 - Course Overview.mp4
    04:08
  • 2 - How to Take this Course and How to Get Support.mp4
    01:24
  • 3 - Text Lecture How to Take this Course and How to Get Support.html
  • 4 - Introduction to Spark.mp4
    02:21
  • 5 - Apache-Spark-with-Java-Slides.pdf
  • 5 - Slides.html
  • 6 - Java 9 Warning.html
  • 7 - Install Java and Git.mp4
    04:11
  • 8 - Source Code.html
  • 9 - Set up Spark project with IntelliJ IDEA.mp4
    07:22
  • 10 - Set up Spark project with Eclipse.mp4
    02:04
  • 11 - Text lecture Set up Spark project with Eclipse.html
  • 12 - Run our first Spark job.mp4
    02:43
  • 13 - Troubleshooting running Hadoop on Windows.html
  • 14 - RDD Basics.mp4
    02:40
  • 15 - Create RDDs.mp4
    02:26
  • 16 - Text Lecture Create RDDs.html
  • 17 - Map and Filter Transformation.mp4
    08:38
  • 18 - Solution to Airports by Latitude Problem.mp4
    01:31
  • 19 - FlatMap Transformation.mp4
    06:27
  • 20 - Text Lectures flatMap Transformation.html
  • 21 - Set Operation.mp4
    07:37
  • 22 - Sampling With Replacement and Sampling Without Replacement.html
  • 23 - Solution for the Same Hosts Problem.html
  • 24 - Actions.mp4
    08:06
  • 25 - Solution to Sum of Numbers Problem.mp4
    01:43
  • 26 - Important Aspects about RDD.mp4
    01:28
  • 27 - Summary of RDD Operations.mp4
    02:23
  • 28 - Caching and Persistence.mp4
    05:09
  • 29 - Spark Architecture.mp4
    02:55
  • 30 - Spark Components.mp4
    05:20
  • 31 - Introduction to Pair RDD.mp4
    01:32
  • 32 - Create Pair RDDs.mp4
    03:54
  • 33 - Filter and MapValue Transformations on Pair RDD.mp4
    04:52
  • 34 - Reduce By Key Aggregation.mp4
    05:14
  • 35 - Sample solution for the Average House problem.mp4
    03:16
  • 36 - Group By Key Transformation.mp4
    04:43
  • 37 - Sort By Key Transformation.mp4
    02:49
  • 38 - Sample Solution for the Sorted Word Count Problem.mp4
    02:00
  • 39 - Data Partitioning.mp4
    04:12
  • 40 - Join Operations.mp4
    04:56
  • 41 - Extra Learning Material How are Big Companies using Apache Spark.html
  • 42 - Accumulators.mp4
    05:30
  • 43 - Text Lecture Accumulators.html
  • 44 - Solution to StackOverflow Survey Followup Problem.mp4
    01:20
  • 45 - Broadcast Variables.mp4
    06:48
  • 46 - Introduction to Spark SQL.mp4
    03:48
  • 47 - Spark SQL in Action.mp4
    14:43
  • 48 - Spark SQL practice House Price Problem.mp4
    01:52
  • 49 - Spark SQL Joins.mp4
    06:21
  • 50 - Strongly Typed Dataset.mp4
    08:31
  • 51 - Use Dataset or RDD.mp4
    02:56
  • 52 - Dataset and RDD Conversion.mp4
    02:58
  • 53 - Performance Tuning of Spark SQL.mp4
    02:44
  • 54 - Extra Learning Material Avoid These Mistakes While Writing Apache Spark Program.html
  • 55 - Introduction to Running Spark in a Cluster.mp4
    04:09
  • 56 - Package Spark Application and Use spark-submit.mp4
    08:08
  • 57 - Run Spark Application on Amazon EMR (Elastic MapReduce) cluster.mp4
    13:32
  • 58 - Future Learning.mp4
    02:46
  • 59 - Text Lecture Future Learning.html
  • 60 - Coupons to Our Other Courses.html
    Description


    Learn to analyze large data sets with Apache Spark through 10+ hands-on examples. Take your big data skills to the next level.

    What You'll Learn?


    • An overview of the architecture of Apache Spark.
    • Work with Apache Spark's primary abstraction, resilient distributed datasets (RDDs), to process and analyze large data sets.
    • Develop Apache Spark 2.0 applications using RDD transformations and actions and Spark SQL (a short illustrative sketch follows this list).
    • Scale up Spark applications on a Hadoop YARN cluster through Amazon's Elastic MapReduce service.
    • Analyze structured and semi-structured data using Datasets and DataFrames, and develop a thorough understanding of Spark SQL.
    • Share information across different nodes of an Apache Spark cluster using broadcast variables and accumulators.
    • Apply advanced techniques to optimize and tune Apache Spark jobs by partitioning, caching, and persisting RDDs.
    • Best practices for working with Apache Spark in the field.
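
    To give a concrete flavor of the RDD work described above, here is a minimal, illustrative sketch using Spark's Java API. It is a sketch only: the class name, input path, and local master setting are assumptions for illustration and are not taken from the course materials.

```java
// Minimal, hypothetical sketch of RDD transformations and actions with Spark's Java API.
// The input path "in/airports.text" and the local[*] master are assumptions for illustration.
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class RddSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("rddSketch").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Load a text file into an RDD, one element per line.
        JavaRDD<String> lines = sc.textFile("in/airports.text");

        // Transformations are lazy: they only describe the computation.
        JavaRDD<String> nonEmpty = lines.filter(line -> !line.isEmpty());
        JavaRDD<Integer> lengths = nonEmpty.map(String::length);

        // Actions trigger the actual work.
        long count = lengths.count();
        Integer totalChars = lengths.reduce(Integer::sum);

        System.out.println("lines: " + count + ", total characters: " + totalChars);
        sc.close();
    }
}
```

    Even in this toy example the key idea is visible: transformations such as filter and map only set up the computation, while actions such as count and reduce make Spark execute it.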

    Who is this for?


  • Anyone who wants to fully understand how Apache Spark technology works and learn how Apache Spark is being used in the field.
  • Software engineers who want to develop Apache Spark 2.0 applications using Spark Core and Spark SQL.
  • Data scientists or data engineers who want to advance their career by improving their big data processing skills.
    What You Need to Know?


  • A computer running Windows, macOS, or Linux
  • Previous Java programming skills
  • Java 8 experience is preferred but NOT required
    More details


    Description

    What is this course about:

    This course covers all the fundamentals of Apache Spark with Java and teaches you everything you need to know about developing Spark applications with Java. By the end of this course, you will gain in-depth knowledge about Apache Spark and general big data analysis and manipulation skills that can help your company adopt Apache Spark for building big data processing pipelines and data analytics applications.

    This course covers 10+ hands-on big data examples. You will learn how to frame data analysis problems as Spark problems. Together we will work through examples such as aggregating NASA Apache web logs from different sources; exploring the price trend in California real estate data; writing Spark applications to find the median salary of developers in different countries from the Stack Overflow survey data; and building a system to analyze how maker spaces are distributed across different regions in the United Kingdom. And much, much more.
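
    As a hedged illustration of how such an analysis might be framed as a Spark job, the sketch below computes an average value per key with a pair RDD, in the spirit of the real estate example; the file name and column layout (location in the first column, price in the second) are assumptions, not the actual course data set.

```java
// Hypothetical sketch: average price per location using a pair RDD.
// The CSV path "in/RealEstate.csv" and its column layout are assumptions for illustration.
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

public class AveragePriceSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("avgPrice").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<String> lines = sc.textFile("in/RealEstate.csv")
                .filter(line -> !line.startsWith("Location"));   // skip an assumed header row

        // Pair each location with (price, 1) so sums and counts can be reduced together.
        JavaPairRDD<String, Tuple2<Double, Integer>> priceAndCount = lines.mapToPair(line -> {
            String[] cols = line.split(",");
            double price = Double.parseDouble(cols[1]);
            return new Tuple2<>(cols[0], new Tuple2<>(price, 1));
        });

        // reduceByKey combines the values per key: add the prices and add the counts.
        JavaPairRDD<String, Tuple2<Double, Integer>> sums =
                priceAndCount.reduceByKey((a, b) -> new Tuple2<>(a._1() + b._1(), a._2() + b._2()));

        // Divide total price by count to get the average per location.
        sums.mapValues(t -> t._1() / t._2())
            .collect()
            .forEach(pair -> System.out.println(pair._1() + " : " + pair._2()));

        sc.close();
    }
}
```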

    What will you learn from this course:

    In particular, you will learn:

    • An overview of the architecture of Apache Spark.

    • Develop Apache Spark 2.0 applications with Java using RDD transformations and actions and Spark SQL.

    • Work with Apache Spark's primary abstraction, resilient distributed datasets (RDDs), to process and analyze large data sets.

    • Deep dive into advanced techniques to optimize and tune Apache Spark jobs by partitioning, caching and persisting RDDs.

    • Scale up Spark applications on a Hadoop YARN cluster through Amazon's Elastic MapReduce service.

    • Analyze structured and semi-structured data using Datasets and DataFrames, and develop a thorough understanding of Spark SQL (see the short sketch after this list).

    • Share information across different nodes of an Apache Spark cluster using broadcast variables and accumulators.
    • Best practices of working with Apache Spark in the field.

    • Big data ecosystem overview.
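
    For the Spark SQL side, a minimal sketch of the Java DataFrame API might look like the following. The file name and column names (country, salary) are assumptions loosely inspired by the Stack Overflow survey example mentioned earlier, not the actual course code.

```java
// Hypothetical sketch of Spark SQL: load a CSV into a DataFrame and aggregate it.
// The path "in/survey.csv" and the country/salary columns are assumptions for illustration.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import static org.apache.spark.sql.functions.avg;
import static org.apache.spark.sql.functions.col;

public class SparkSqlSketch {
    public static void main(String[] args) {
        SparkSession session = SparkSession.builder()
                .appName("sparkSqlSketch")
                .master("local[*]")
                .getOrCreate();

        // A DataFrame is a Dataset<Row>: untyped rows with a known schema.
        Dataset<Row> responses = session.read()
                .option("header", "true")
                .option("inferSchema", "true")
                .csv("in/survey.csv");

        // Column-based query; the same thing could be expressed as a SQL string.
        responses.filter(col("salary").isNotNull())
                 .groupBy(col("country"))
                 .agg(avg(col("salary")).alias("avg_salary"))
                 .show();

        session.stop();
    }
}
```

    The same Dataset API also supports strongly typed views over Java beans, which is the topic of the "Strongly Typed Dataset" lecture in the curriculum.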

    Why should we learn Apache Spark:

    Apache Spark gives us the ability to build cutting-edge applications. It is also one of the most compelling technologies of the last decade in terms of its disruption of the big data world.

    Spark provides in-memory cluster computing which greatly boosts the speed of iterative algorithms and interactive data mining tasks.
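
    As a rough sketch of why this matters in practice, persisting an intermediate RDD in memory lets several actions reuse it without recomputing it from the source each time (the input path below is hypothetical):

```java
// Hypothetical sketch of caching/persistence: reuse a filtered RDD across actions.
// The log file path is an assumption for illustration.
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.storage.StorageLevel;

public class CachingSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("cachingSketch").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<String> logs = sc.textFile("in/access.log");

        // Keep the filtered RDD in memory so the two actions below reuse it
        // instead of re-reading and re-filtering the file for each action.
        JavaRDD<String> errors = logs.filter(line -> line.contains("ERROR"))
                                     .persist(StorageLevel.MEMORY_ONLY());

        System.out.println("error lines: " + errors.count());    // first action
        errors.take(10).forEach(System.out::println);             // second action, served from memory

        sc.close();
    }
}
```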

    Apache Spark is the next-generation processing engine for big data.

    Tons of companies are adopting Apache Spark to extract meaning from massive data sets. Today you have access to that same big data technology right on your desktop.

    Apache Spark is becoming a must-have tool for big data engineers and data scientists.

    About the author:

    Since 2015, James has been helping his company adopt Apache Spark for building its big data processing pipeline and data analytics applications.

    James' company has gained massive benefits from adopting Apache Spark in production. In this course, he is going to share with you his years of knowledge and best practices of working with Spark in the field.

    Why choose this course?

    This course is very hands-on. James has put a lot of effort into providing you with not only the theory but also real-life examples of developing Spark applications that you can try out on your own laptop.

    James has uploaded all the source code to GitHub, and you will be able to follow along on Windows, macOS, or Linux.

    By the end of this course, James is confident that you will gain in-depth knowledge about Spark and general big data analysis and data manipulation skills. You'll be able to develop Spark applications that analyze gigabytes of data, both on your laptop and in the cloud using Amazon's Elastic MapReduce service!

    30-day Money-back Guarantee!

    You will get a 30-day money-back guarantee from Udemy for this course.

    If you are not satisfied, simply ask for a refund within 30 days. You will get a full refund, no questions asked.

    Are you ready to take your big data analysis skills and career to the next level? Take this course now!

    You will go from zero to Spark hero in 4 hours.

    Who this course is for:

    • Anyone who wants to fully understand how Apache Spark technology works and learn how Apache Spark is being used in the field.
    • Software engineers who want to develop Apache Spark 2.0 applications using Spark Core and Spark SQL.
    • Data scientists or data engineers who want to advance their career by improving their big data processing skills.

    Instructors

    Tao is a passionate software engineer who works at a leading big data analysis company in Silicon Valley. Previously, Tao worked at big IT companies such as IBM. Tao has an MS degree in Computer Science from McGill University and many years of experience as a teaching assistant for various computer science classes. When Tao is not working, he enjoys reading and swimming, and he is a passionate photographer.
    James Lee is a passionate software wizard working at one of the top Silicon Valley-based startups specializing in big data analysis. In the past, he has worked at big companies such as Google and Amazon. In his day job, he works with big data technologies such as Cassandra and Elasticsearch, and he is an absolute Docker technology geek and IntelliJ IDEA lover with a strong focus on efficiency and simplicity. Apart from his career as a software engineer, he is keen on sharing his knowledge with others and guiding them, especially in startups and programming. He has been teaching courses and conducting workshops on Java programming and IntelliJ IDEA since he was 21. He enjoys working with Udemy because he can share all his field knowledge and secrets with a broader audience, and he hopes students will benefit from his years of experience. Students will be thrilled to be associated with James and Udemy, and we are excited to have you on board. James Lee has an MS degree in Computer Science from McGill University and many years of experience as a teaching assistant for various computer science classes. He also enjoys skiing and swimming, and he is a passionate traveler.
    Skilled programmers remain in high demand in this digitally focused world. Level Up offers a practical and engaging learning solution that is revolutionizing professional online training. Level Up provides courses delivered by top industry experts along with well-designed, real-life course projects. We teach technology the way it is used in industry. We offer a range of courses that take you from the fundamentals of programming to advanced topics in areas such as Big Data, DevOps, Data Science, and Apache Spark. The Level Up Udemy courses are your gateway to high-quality software courses from industry experts and influencers.
    Jiarui Zhou, a 17-year-old student from Abbey Park High School, is passionate about Computer Science and Computer Engineering. With advanced proficiency in Python and a solid grasp of other languages, including Java, Jiarui embodies a blend of youthful enthusiasm and technical expertise. His academic journey reflects a deep commitment to exploring the ever-evolving landscape of technology and a keen interest in applying his skills in real-world scenarios.
    Students take courses primarily to improve job-related skills. Some courses generate credit toward technical certification. Udemy has made a special effort to attract corporate trainers seeking to create coursework for employees of their companies.
    • Language: English
    • Training sessions: 45
    • Duration: 3:24:10
    • English subtitles: Yes
    • Release date: 2024/04/28