Apache Flink is an open-source platform for scalable batch and stream data processing. Flink supports batch and streaming analytics in one system, and analytical programs can be written in concise, elegant APIs in Java and Scala.


0 votes | 1 answer | 18 views

Process consecutive messages

I have a stream of object coordinates (time, x, y) that I want to transform into a stream of distances and then into a stream of speeds. To do this I need to process two consecutive messages each time. ...
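One way to do this (a minimal sketch, assuming the coordinates arrive as a Tuple3<Long, Double, Double> with time in milliseconds, on a keyed stream): keep the previous event in keyed state and emit a speed once two consecutive events have been seen.

import org.apache.flink.api.common.functions.RichFlatMapFunction;
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.util.Collector;

// Emits a speed for each pair of consecutive (time, x, y) events of one key.
public class SpeedFunction extends RichFlatMapFunction<Tuple3<Long, Double, Double>, Double> {

    private transient ValueState<Tuple3<Long, Double, Double>> previous;

    @Override
    public void open(Configuration parameters) {
        previous = getRuntimeContext().getState(new ValueStateDescriptor<>(
                "previous",
                TypeInformation.of(new TypeHint<Tuple3<Long, Double, Double>>() {})));
    }

    @Override
    public void flatMap(Tuple3<Long, Double, Double> current, Collector<Double> out) throws Exception {
        Tuple3<Long, Double, Double> prev = previous.value();
        if (prev != null) {
            double dx = current.f1 - prev.f1;
            double dy = current.f2 - prev.f2;
            double distance = Math.sqrt(dx * dx + dy * dy);
            long dt = current.f0 - prev.f0;            // assumed milliseconds
            if (dt > 0) {
                out.collect(distance / (dt / 1000.0)); // distance units per second
            }
        }
        previous.update(current);
    }
}

Apply it after keyBy(...) on whatever identifies one tracked object; ValueState requires a keyed stream, so a single global stream would still need a constant key.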
0 votes | 0 answers | 15 views

Flink back pressure indication - how to identify its root cause

How can I identify the root cause of back pressure in a task (i.e., which operator of a multi-operator task is causing it)? Are there any relevant logs? (failed tracking ...
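One way to narrow it down (a sketch; the stream and function names are placeholders): the web UI samples back pressure per task, so temporarily breaking the operator chain turns each operator into its own task, and the sampling then points at a single operator.

// Placeholders: "stream", ParseFunction, MyFilter, MySink stand in for the real pipeline.
// With chaining broken, the web UI's back pressure view reports per operator.
stream
    .map(new ParseFunction()).name("parse").disableChaining()
    .filter(new MyFilter()).name("filter").disableChaining()
    .addSink(new MySink()).name("sink");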
0 votes | 0 answers | 10 views

docker-flink not showing all log statements

I am using two docker-flink images with AMIDST and the following sample code. AMIDST is a probabilistic graphical model framework that supports Flink. One image runs as the JobManager, the other as ...
0 votes | 1 answer | 33 views

Flink - Grouping queries to an external system per operator instance while enriching an event

I am currently writing a streaming application where, as input, I receive alerts from a Kafka topic (one alert is linked to one resource; for example, one alert will be linked to my-router-1 or ...
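A common pattern for grouping external lookups per operator instance (a sketch; Alert, EnrichedAlert, and LookupClient are hypothetical stand-ins for the real types and external system): create one client per parallel subtask in open() and reuse it for every event that subtask processes.

import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;

// One client per parallel subtask, created in open(), reused for all events
// that this instance processes.
public class EnrichAlertFunction extends RichMapFunction<Alert, EnrichedAlert> {

    private transient LookupClient client;   // hypothetical external-system client

    @Override
    public void open(Configuration parameters) {
        client = new LookupClient();         // one connection per operator instance
    }

    @Override
    public EnrichedAlert map(Alert alert) {
        // keyBy(resourceId) upstream routes all alerts for one resource to the
        // same instance, so per-resource lookups can be cached or batched here.
        return client.enrich(alert);
    }

    @Override
    public void close() {
        client.shutdown();
    }
}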
0 votes | 1 answer | 51 views

Reading a CSV file with Flink, Scala, addSource and readCsvFile

I want to read a CSV file using Flink, the Scala language, and the addSource and readCsvFile functions. I have not found any simple examples of this. I have only found: https://github.com/dataArtisans/...
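For the readCsvFile half, a minimal batch-API sketch (shown in Java; the Scala DataSet API offers an equivalent readCsvFile, and the path and column types are placeholders):

import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;

public class ReadCsvExample {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // Parse each row into a (String, Integer) tuple; the path is a placeholder.
        DataSet<Tuple2<String, Integer>> rows = env
                .readCsvFile("file:///path/to/input.csv")
                .fieldDelimiter(",")
                .ignoreFirstLine()          // skip a header row, if present
                .types(String.class, Integer.class);

        rows.print();
    }
}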
1 vote | 1 answer | 22 views

Apache Flink CI/CD--Kubernetes--GitHub

Has someone successfully run Flink jobs with this kind of setup (GitHub CI/CD and Kubernetes)? Since Flink jobs can’t be dockerized and deployed in a natural way as part of the container, I am not ...
3 votes | 1 answer | 31 views

Backpressure in connected Flink streams

I am experimenting with how to propagate back pressure correctly when I have ConnectedStreams as part of my computation graph. The problem: I have two sources, and one ingests data faster than the ...
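For reference, the shape in question (a sketch; the sources and event types are placeholders): two streams connected into one co-operator, whose back pressure throttles both upstream sources.

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.functions.co.CoMapFunction;

// Placeholders: fastSource, slowSource, EventA, EventB. A single co-operator
// consumes both inputs; back pressure from downstream slows both sources.
DataStream<String> merged = fastSource
        .connect(slowSource)
        .map(new CoMapFunction<EventA, EventB, String>() {
            @Override
            public String map1(EventA a) { return a.toString(); }

            @Override
            public String map2(EventB b) { return b.toString(); }
        });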
0 votes | 0 answers | 10 views

Move files after reading in Flink streaming

How do I move a source file after processing with Apache Flink?

String ftpUser = URLEncoder.encode(username, "UTF-8");
String ftpPass = URLEncoder.encode(password, "UTF-8");
URL url = ...
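As far as I know the continuous streaming file source has no built-in post-read hook, but for a bounded job one option (a sketch; both paths are placeholders) is to move the input once execute() has returned:

import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

// After the job has fully consumed the input, archive it.
env.execute("read input");
Files.move(
        Paths.get("/data/incoming/input.csv"),     // placeholder source path
        Paths.get("/data/processed/input.csv"),    // placeholder target path
        StandardCopyOption.REPLACE_EXISTING);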
0 votes | 0 answers | 21 views | +50

Apache Flink FTP source file not found

I have a Flink stream processor that runs fine for hours but then gets a sudden error: a file-not-found error, yet when I check, the file is there. Code:

String ftpUser = URLEncoder.encode(...
0 votes | 0 answers | 9 views

Uploading jars using Flink REST /jars/upload

The REST service to upload a jar to Flink requires both a filename and a filepath query parameter. The filepath parameter specifies where the jar file is located. Uploading from a remote server, I do ...
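In newer Flink versions, /jars/upload accepts the jar as a multipart/form-data body rather than a server-side filepath. A sketch using plain HttpURLConnection (the host, jar path, and the "jarfile" field name are assumptions to verify against your Flink version):

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;

public class JarUpload {
    public static void main(String[] args) throws Exception {
        String boundary = "----flink-jar-upload";
        URL url = new URL("http://localhost:8081/jars/upload"); // placeholder host
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type",
                "multipart/form-data; boundary=" + boundary);

        byte[] jar = Files.readAllBytes(Paths.get("target/job.jar")); // placeholder
        try (OutputStream out = conn.getOutputStream()) {
            out.write(("--" + boundary + "\r\n"
                    + "Content-Disposition: form-data; name=\"jarfile\"; "
                    + "filename=\"job.jar\"\r\n"
                    + "Content-Type: application/x-java-archive\r\n\r\n").getBytes());
            out.write(jar);
            out.write(("\r\n--" + boundary + "--\r\n").getBytes());
        }
        System.out.println("HTTP " + conn.getResponseCode());
    }
}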
1 vote | 1 answer | 25 views

Unit Testing Flink Functions

I am using Flink v1.4.0. I have implemented a module, as part of a package I am developing, whose role is to deduplicate a stream. The module is quite simple:

public class RemoveDuplicateFilter<...
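Functions without Flink-managed state can be unit tested as plain objects (a sketch; MyDedupFlatMap is a hypothetical stand-in for the function under test, and stateful Rich functions generally need Flink's test harnesses instead):

import static org.junit.Assert.assertEquals;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

import org.apache.flink.api.common.functions.util.ListCollector;
import org.junit.Test;

public class RemoveDuplicateTest {

    @Test
    public void dropsConsecutiveDuplicates() throws Exception {
        // Hypothetical function under test: emits only values it has not seen.
        MyDedupFlatMap fn = new MyDedupFlatMap();

        List<String> output = new ArrayList<>();
        ListCollector<String> collector = new ListCollector<>(output);

        fn.flatMap("a", collector);
        fn.flatMap("a", collector);  // duplicate, should be dropped
        fn.flatMap("b", collector);

        assertEquals(Arrays.asList("a", "b"), output);
    }
}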
0 votes | 0 answers | 7 views

Exiting computation for some nodes in Flink Gelly

Is there any way to exit the computation for some specific nodes (specified by some condition)? Just like the voteToHalt() function in Giraph.
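As far as I know Gelly has no explicit voteToHalt(); in its vertex-centric iterations a vertex that stops receiving messages is simply not invoked again, so a conditional early return that sends nothing has much the same effect. A sketch (the halt condition and value types are placeholders):

import org.apache.flink.graph.Edge;
import org.apache.flink.graph.Vertex;
import org.apache.flink.graph.pregel.ComputeFunction;
import org.apache.flink.graph.pregel.MessageIterator;

// A vertex that neither updates its value nor sends messages stops being
// invoked once no more messages arrive for it, which is the practical
// equivalent of Giraph's voteToHalt().
public class ConditionalCompute extends ComputeFunction<Long, Double, Double, Double> {

    @Override
    public void compute(Vertex<Long, Double> vertex, MessageIterator<Double> messages) {
        if (vertex.getValue() < 0) {
            return; // placeholder halt condition: stay silent, send nothing
        }
        double min = vertex.getValue();
        for (Double msg : messages) {
            min = Math.min(min, msg);
        }
        if (min < vertex.getValue()) {
            setNewVertexValue(min);
            for (Edge<Long, Double> e : getEdges()) {
                sendMessageTo(e.getTarget(), min);
            }
        }
    }
}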
1 vote | 1 answer | 30 views

Flink, why does a CoMap return “DataStream with Product with Serializable” instead of just a DataStream?

I need to understand why eventStream.connect(otherStream).map(_ => Right(2), _ => Left("2")) does not generate a DataStream[Either[String, Int]] but a DataStream[Either[String, Int]] with ...
0 votes | 0 answers | 30 views

Flink - No FileSystem for scheme: hdfs

I am currently developing a Flink 1.4 application that reads an Avro file from a Hadoop cluster. Running it in local mode in my IDE works perfectly fine, but when I submit it to the ...
0 votes | 0 answers | 13 views

Flink 1.4.0 Kafka connector assign partition

In Kafka, you can assign partitions using KafkaConsumer.assign:

KafkaConsumer<String, String> consumer = new KafkaConsumer<>(prop);
consumer.assign(partitions);

Is there a way to do ...
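For reference, the plain Kafka client pattern the snippet refers to (a sketch with a placeholder broker and topic; this is the vanilla Kafka API, and as far as I know the FlinkKafkaConsumer in 1.4 does not expose assign() directly):

import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

public class AssignExample {
    public static void main(String[] args) {
        Properties prop = new Properties();
        prop.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        prop.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        prop.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(prop)) {
            // Read only partition 0 of "events" (both placeholders).
            consumer.assign(Collections.singletonList(new TopicPartition("events", 0)));
            ConsumerRecords<String, String> records = consumer.poll(1000);
            for (ConsumerRecord<String, String> r : records) {
                System.out.println(r.partition() + ": " + r.value());
            }
        }
    }
}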
