14/11/15 21:02:23 INFO dstream.SocketInputDStream: Checkpoint interval = null
14/11/15 21:02:23 INFO dstream.SocketInputDStream: Remember duration = 10000 ms
14/11/15 21:02:23 INFO dstream.SocketInputDStream: Initialized and validated org.apache.spark.streaming.dstream.SocketInputDStream@ff3436d
14/11/15 21:02:23 INFO dstream.ForEachDStream: Slide time = 10000 ms
14/11/15 21:02:23 INFO dstream.ForEachDStream: Storage level = StorageLevel(false, false, false, false, 1)
14/11/15 21:02:23 INFO dstream.ForEachDStream: Checkpoint interval = null
14/11/15 21:02:23 INFO dstream.ForEachDStream: Remember duration = 10000 ms
14/11/15 21:02:23 INFO dstream.ForEachDStream: Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@5a10b6e8
14/11/15 21:02:23 INFO scheduler.ReceiverTracker: Starting 1 receivers
14/11/15 21:02:23 INFO spark.SparkContext: Starting job: runJob at ReceiverTracker.scala:275
...
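The initialization messages above reflect the structure of our streaming application: a socket-based input DStream, a 10-second batch interval, and a print output operation (which registers the ForEachDStream seen in the log). As a rough sketch only, with the object name, host, and port chosen here for illustration rather than taken from the exact listing, such an application looks like the following:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Minimal sketch: connect to a local socket producer and print each 10-second batch.
// The app name, host, and port are illustrative assumptions.
object SimpleStreamingApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("Simple Streaming App")
    // The 10-second batch interval accounts for the "Remember duration = 10000 ms"
    // and "Slide time = 10000 ms" entries in the log output above.
    val ssc = new StreamingContext(conf, Seconds(10))

    // Creates the SocketInputDStream seen in the log output.
    val stream = ssc.socketTextStream("localhost", 9999)

    // print() registers a ForEachDStream that outputs the first
    // elements of each batch.
    stream.print()

    ssc.start()
    ssc.awaitTermination()
  }
}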
At the same time, you should see that the terminal window running the producer displays
something like the following:
...
Got client connected from: /127.0.0.1
Created 2 events...
Created 2 events...
Created 3 events...
Created 1 events...
Created 5 events...
...
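The producer behind this output can be sketched as a simple server socket that accepts the receiver's connection and then writes a random number of text events each second, reporting how many it created. The following is a simplified illustration, where the port, event format, and timing are assumptions rather than the exact producer code:

import java.io.PrintWriter
import java.net.ServerSocket
import scala.util.Random

// Rough sketch of a producer: accept a client connection (the Spark receiver),
// then emit a random number of simple text events every second.
// Port, event format, and timing are illustrative assumptions.
object StreamingProducer {
  def main(args: Array[String]): Unit = {
    val random = new Random()
    val listener = new ServerSocket(9999)
    println("Listening on port: 9999")

    val socket = listener.accept()
    println("Got client connected from: " + socket.getInetAddress)
    val out = new PrintWriter(socket.getOutputStream, true)

    while (true) {
      Thread.sleep(1000)
      // Generate between 1 and 5 events per tick.
      val numEvents = random.nextInt(5) + 1
      (1 to numEvents).foreach { i =>
        out.println(s"event-$i,${random.nextInt(100)}")
      }
      println(s"Created $numEvents events...")
    }
  }
}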
After about 10 seconds, which is the time of our streaming batch interval, Spark Streaming will trigger a computation on the stream due to our use of the print operator. This