Spark Streaming - Malay
Dstreams are internally, a collection of _______.---RDD
HDFS cannot be a sink for Spark Streaming.---False
We cannot configure Twitter as a data source system for Spark Streaming.----False
Dstreams can be created from an existing Dstream.--True
Dstreams cannot be created directly from sources such as Kafka and Flume----False
Internally DStream is represented as a sequence of _____ arriving at discrete time intervals---RDD
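A minimal sketch of what this means in practice, assuming a StreamingContext named ssc already exists (a fuller setup appears after the ssc.start() entry below) and using an illustrative socket source; foreachRDD exposes the single RDD generated for each batch interval.

    // Sketch only: `ssc` is an assumed, pre-built StreamingContext; host/port are illustrative.
    val lines = ssc.socketTextStream("localhost", 9999)

    // Each batch interval yields exactly one RDD; foreachRDD hands it over
    // together with the batch time at which it was produced.
    lines.foreachRDD { (rdd, time) =>
      println(s"Batch at $time holds ${rdd.count()} records")
    }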
Dstreams are internally, a collection of---RDD
The receiver divides the stream into blocks and keeps them in memory.---True
Block Management units in the worker node report to ____.---Block Management Master in the Driver
Block Management Master keeps track of ___.---Block IDs
Starting point of a streaming application is _______.---ssc.start()
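A minimal end-to-end sketch tying these answers together: the batch interval is fixed when the StreamingContext is created, the processing is declared on the DStream, and nothing runs until ssc.start() is called. The local master, app name, host, and port are illustrative choices, not part of the quiz.

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    // Batch interval is fixed here, at StreamingContext creation time.
    val conf = new SparkConf().setMaster("local[2]").setAppName("StreamingSketch")
    val ssc  = new StreamingContext(conf, Seconds(5))

    // Declare the processing on the DStream; nothing executes yet.
    val lines  = ssc.socketTextStream("localhost", 9999)
    val counts = lines.flatMap(_.split(" ")).map(w => (w, 1)).reduceByKey(_ + _)
    counts.print()

    // The streaming application actually starts receiving and processing here.
    ssc.start()
    ssc.awaitTermination()

Once start() has been called, no new streaming computations can be added to that context, which is why context creation and stream definition come first.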
With Spark Streaming, the incoming data is split into micro batches---True
Which among the following is true about Spark Streaming?----All the options
Who is responsible for keeping track of the Block Ids?----Block Management Master in the Driver
Data sources for Spark Streaming that come under the 'Advanced sources' category include----All the Options
Batch interval is configured at------the creation of the Spark Streaming Context
For every batch interval, the Driver launches tasks to process a block---True
What is the programming abstraction in Spark Streaming?----Dstreams
What is a batch Interval?----Interval at which a Dstream is processed
Dstreams are internally----Collection of RDD
What is a Sliding Interval?----Interval at which sliding of the window area occurs.
Which among the following is true about Window Operations?------Window duration should be a multiple of batch interval
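A sketch of a window operation consistent with that rule: with a 10-second batch interval, both the 30-second window duration and the 10-second sliding interval are multiples of it. The StreamingContext ssc, the socket source, and the chosen durations are illustrative.

    import org.apache.spark.streaming.Seconds

    // Assumes ssc was created with a 10-second batch interval, e.g.
    // val ssc = new StreamingContext(conf, Seconds(10))
    val pairs = ssc.socketTextStream("localhost", 9999)
      .flatMap(_.split(" "))
      .map(w => (w, 1))

    // Window duration (30s) and sliding interval (10s) are both multiples
    // of the 10-second batch interval.
    val windowedCounts = pairs.reduceByKeyAndWindow(
      (a: Int, b: Int) => a + b, // reduce function applied over the window
      Seconds(30),               // window duration
      Seconds(10)                // sliding interval
    )
    windowedCounts.print()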
Dstreams are immutable. Choose the right option.----Yes, like RDDs, Dstreams are immutable
We specify ___________ when we create streaming context.----batch interval
Dstreams are-----Collection of RDD
Which among the following are Basic Sources of Spark Streaming?----Kafka
reduceByKey is a----Action
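For reference, reduceByKey on a pair DStream produces a new DStream whose RDDs hold per-key aggregates computed independently for every batch. The sketch below uses queueStream, a built-in testing source, purely to make the example self-contained; the queue contents and app name are made up for illustration.

    import scala.collection.mutable
    import org.apache.spark.SparkConf
    import org.apache.spark.rdd.RDD
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf().setMaster("local[2]").setAppName("ReduceByKeySketch")
    val ssc  = new StreamingContext(conf, Seconds(1))

    // queueStream is a simple built-in source, convenient for local testing.
    val queue = mutable.Queue[RDD[(String, Int)]]()
    val pairs = ssc.queueStream(queue)

    // Builds a new DStream of per-key sums, one result RDD per batch interval.
    val summed = pairs.reduceByKey(_ + _)
    summed.print()

    queue += ssc.sparkContext.makeRDD(Seq(("a", 1), ("a", 2), ("b", 5)))
    ssc.start()
    ssc.awaitTermination()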
Receiver receives data from the streaming sources at the start of _________.------Streaming Context
DStream represents a continuous stream of data.---True
Spark Streaming has two categories of sources - Basic sources and Advanced sources.-----True
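For the record, the Spark Streaming documentation places file streams, socket connections, and RDD queues under basic sources (available directly on the StreamingContext), while advanced sources such as Kafka, Flume, and Kinesis require linking an external connector artifact. A sketch, assuming an existing StreamingContext ssc and with the directory, host, and port invented for illustration:

    // Basic sources: created straight from the StreamingContext, no extra dependency.
    val fileStream   = ssc.textFileStream("hdfs:///tmp/streaming-input") // monitors a directory for new files
    val socketStream = ssc.socketTextStream("localhost", 9999)           // plain TCP text source

    // Advanced sources such as Kafka need an external connector artifact on the
    // classpath (the spark-streaming-kafka integration) and its utility classes
    // to create the input DStream.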