Running Flink on java 11


Running Flink on java 11

KristoffSC
Hi guys,
We have a requirement in our project to use Java 11, and we would
really like to use Flink because it seems to match our needs perfectly.

We were testing it on Java 1.8 and everything looked fine.
We also tried running it on Java 11 and that looks fine too, at least for now.

We were also running this as a Job Cluster, and since those images [1] are
based on openjdk:8-jre-alpine, we switched to openjdk:13-jdk-alpine. The cluster
started and the job was submitted. All seemed fine.
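
For context, the only change we made was the base image in the job-cluster Dockerfile from [1]; a rough sketch (the COPY/ENTRYPOINT lines below are illustrative placeholders, not the exact contents of the official Dockerfile):

```dockerfile
# Before: FROM openjdk:8-jre-alpine
FROM openjdk:13-jdk-alpine

# The rest of the job-cluster Dockerfile stays as in [1];
# these lines are only a sketch of its shape.
COPY flink-dist /opt/flink
COPY job.jar /opt/flink/lib/job.jar
ENTRYPOINT ["/opt/flink/bin/standalone-job.sh"]
```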

The job and the 3rd party library that it uses were compiled with Java 11.
I was looking for any posts related to Java 11 issues and I found this
[2] one.
We are also aware of the ongoing FLINK-10725 [3], but that is assigned to the
Flink 1.10 version.

Given all of this, I would like to ask a few questions:

1. Is there any release date planned for 1.10?
2. Are you aware of any issues regarding running Flink on Java 11?
3. If my job code does not use any language features from Java 11, would Flink
handle it when running on Java 11? Or are there internal functionalities
that would not work on Java 11 (things that use Unsafe or reflection)?

Thanks,
Krzysztof


[1]
https://github.com/apache/flink/blob/release-1.9/flink-container/docker/README.md
[2]
http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/UnsupportedOperationException-from-org-apache-flink-shaded-asm6-org-objectweb-asm-ClassVisitor-visit1-td28571.html
[3] https://issues.apache.org/jira/browse/FLINK-10725



--
Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/

Re: Running Flink on java 11

KristoffSC
Hi,
Thank you for your answer. By the way, it seems that you sent the reply only to my address and not to the mailing list :)

I'm looking forward to trying out the 1.10 RC then.

Regarding the second thing you wrote, that
"on Java 11, all the tests(including end to end tests) would be run with Java 11 profile now."
I'm not sure I fully get that. Do you mean that Flink builds currently run on Java 11?

I was not rebuilding the Flink 1.9.1 sources with JDK 11. I just ran the 1.9.1 build on JRE 11 locally on my machine, and
I also modified the Job Cluster Dockerfile to use openjdk:13-jdk-alpine as the base image instead of openjdk:8-jre-alpine.

Are there any users who are currently running Flink on Java 11 or higher? What are your experiences?

Thanks,

pt., 10 sty 2020 o 03:14 Yangze Guo <[hidden email]> napisał(a):
Hi, Krzysztof

Regarding release-1.10, the community is now focused on this effort.
I believe we will have our first release candidate soon.

Regarding issues when running Flink on Java 11: all the
tests (including end-to-end tests) are run with the Java 11 profile
now. If you run into any problems, feel free to open a new JIRA ticket or
ask on the user/dev ML.

Best,
Yangze Guo


Re: Running Flink on java 11

Yangze Guo
Hi Krzysztof

All the tests run with Java 11 after FLINK-13457 [1]. Its fix version
is set to 1.10, so I'm afraid 1.9.1 is not guaranteed to run on
Java 11. I suggest you wait for release-1.10.

[1] https://issues.apache.org/jira/browse/FLINK-13457

Best,
Yangze Guo


Re: Running Flink on java 11

Chesnay Schepler
In reply to this post by KristoffSC
In regards to what we test:

We run our tests against Java 8 and Java 11, with compilation and testing done with the same JDK.
In other words, we don't check whether Flink compiled with JDK 8 runs on JDK 11, but we currently have no reason to believe that there is a problem (and we have some anecdotal evidence to support this).
Some tests are excluded from Java 11: some due to library incompatibility (e.g., Cassandra), some because our docker images still use Java 8 (anything docker-related is _not_ tested on Java 11, and neither are the YARN end-to-end tests), and some for currently unknown reasons (the Kafka end-to-end test).

You _should_ be free to use any JDK 11 exclusive APIs within your code, but we haven't tested it specifically.

You don't have to re-compile Flink; the binary that we will be releasing should work on both Java 8 and 11.

Do note that we have not done any tests on JDK 12+.




Re: Running Flink on java 11

KristoffSC
Hi,
Yangze Guo, Chesnay Schepler thank you very much for your answers.

I actually have a funny setup.
I have a Flink job module, generated from Flink's Maven archetype.
This module has all the operators and the Flink environment configuration and execution.
It is compiled by Maven with "maven.compiler.target" set to 1.8.

However, I'm using a 3rd party library that was compiled with Java 11.
In order to build my main job module I have to use JDK 11, but I still
have "maven.compiler.target" set to 1.8 there.

As a result, I have a Flink job jar that contains classes from both Java 8 and 11.
Running javap -verbose proves it: all classes from the Flink job module are
Java 8 class files.
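
As an alternative to javap, the class-file version can be read directly from the first bytes of a .class file; a small self-contained sketch (the class and file names here are made up for illustration):

```java
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class ClassVersion {

    /** Reads the class-file major version: 52 = Java 8, 55 = Java 11. */
    static int majorVersion(String path) throws IOException {
        try (DataInputStream in = new DataInputStream(new FileInputStream(path))) {
            if (in.readInt() != 0xCAFEBABE) {
                throw new IOException("not a class file: " + path);
            }
            in.readUnsignedShort(); // minor version
            return in.readUnsignedShort(); // major version
        }
    }

    public static void main(String[] args) throws IOException {
        // e.g. java ClassVersion StreamJob.class (hypothetical file name)
        System.out.println(majorVersion(args[0]));
    }
}
```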

I can build a Flink Job cluster image based on [1]. However, I had to
change the base image from openjdk:8-jre-alpine to
adoptopenjdk/openjdk11:jre-11.0.5_10-alpine, plus remove the installation of
libc6-compat.

After rebuilding the docker image, the Job cluster started and processed messages.

On the original openjdk:8-jre-alpine it was unable to start due to issues with
loading classes from my 3rd party library (an "Unsupported major.minor version"
exception).
So this seems to work.


However if I would change "maven.compiler.target" to Java 11 in my Flink Job
module, then Flink is unable to run the Job giving me this exception

job-cluster_1  | Caused by: java.lang.UnsupportedOperationException
job-cluster_1  |        at org.apache.flink.shaded.asm6.org.objectweb.asm.ClassVisitor.visitNestHostExperimental(ClassVisitor.java:158)
job-cluster_1  |        at org.apache.flink.shaded.asm6.org.objectweb.asm.ClassReader.accept(ClassReader.java:541)
job-cluster_1  |        at org.apache.flink.shaded.asm6.org.objectweb.asm.ClassReader.accept(ClassReader.java:391)
job-cluster_1  |        at org.apache.flink.api.java.ClosureCleaner.cleanThis0(ClosureCleaner.java:187)
job-cluster_1  |        at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:100)
job-cluster_1  |        at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:126)
job-cluster_1  |        at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:71)
job-cluster_1  |        at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.clean(StreamExecutionEnvironment.java:1574)
job-cluster_1  |        at org.apache.flink.streaming.api.datastream.DataStream.clean(DataStream.java:185)
job-cluster_1  |        at org.apache.flink.streaming.api.datastream.DataStream.process(DataStream.java:668)
job-cluster_1  |        at org.apache.flink.streaming.api.datastream.DataStream.process(DataStream.java:645)
job-cluster_1  |        at com.epam.monity.job.StreamJob.doTheJob(StreamJob.java:140)
job-cluster_1  |        at com.epam.monity.job.StreamJob.main(StreamJob.java:46)
job-cluster_1  |        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
job-cluster_1  |        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
job-cluster_1  |        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
job-cluster_1  |        at java.base/java.lang.reflect.Method.invoke(Unknown Source)
job-cluster_1  |        at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:576)
job-cluster_1  |        ... 13 more



Long story short, it seems that for now the job module has to be compiled to
target 1.8 (with JDK 11) if Java 11 libraries are used.

[1]
https://github.com/apache/flink/blob/release-1.9/flink-container/docker/README.md






Re: Running Flink on java 11

Chesnay Schepler
The error you got is due to an older ASM version, which is fixed for 1.10
in https://issues.apache.org/jira/browse/FLINK-13467 .
