Spark Streaming Introduction

Streaming applications are the new hot thing in the data-processing world: everyone wants to process data as soon as it is available. To cater to this need, a new set of tools has appeared on the market. Spark provides streaming on top of its basic batch-processing API, and this approach has been very successful and well accepted in industry. Spark treats the input stream of messages as a series of mini batches. These batches run as frequently as possible, with a minimum interval of one second: every second a new RDD is generated from the stream of input messages, and that data is processed in the usual batch way. This gives the client the feeling of working with a batch-processing system.

In the following video I have covered the basic concepts of Spark Streaming. Please watch it and share your thoughts.
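To make the mini-batch idea concrete, here is a minimal sketch in plain Python (illustrative only, not Spark's actual API): incoming timestamped messages are grouped into one-second windows, and each window is then processed as an ordinary batch, just as Spark Streaming turns each interval of the stream into an RDD.

```python
from collections import defaultdict

BATCH_INTERVAL = 1.0  # seconds, mirroring Spark Streaming's minimum interval

def micro_batches(stream):
    """Group (timestamp, message) pairs into per-interval mini batches."""
    batches = defaultdict(list)
    for ts, msg in stream:
        # All messages arriving within the same interval land in one batch.
        batches[int(ts // BATCH_INTERVAL)].append(msg)
    return [batches[k] for k in sorted(batches)]

def process_batch(batch):
    """Batch-style processing: a simple word count over one mini batch."""
    counts = defaultdict(int)
    for msg in batch:
        for word in msg.split():
            counts[word] += 1
    return dict(counts)

# Simulated input stream of (arrival time, message) pairs.
stream = [(0.1, "hello spark"), (0.7, "hello streaming"), (1.2, "spark spark")]
results = [process_batch(b) for b in micro_batches(stream)]
print(results)
```

Running this groups the first two messages into one batch and the third into the next, and each batch is word-counted independently, which is exactly the feel of batch processing applied repeatedly to a live stream.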

Please subscribe to the channel and let me know what other topics you would like me to cover.


