
alibaba / dragonwell17


Alibaba Dragonwell17 JDK

Home Page: https://dragonwell-jdk.io/

License: GNU General Public License v2.0

Makefile 0.01% Shell 0.18% JavaScript 0.02% M4 0.14% sed 0.01% Roff 0.17% HTML 0.33% CSS 0.01% Python 0.05% Java 76.56% XSLT 0.25% Batchfile 0.01% Perl 0.01% C++ 13.52% C 5.92% DTrace 0.01% Assembly 2.42% GDB 0.01% Objective-C 0.42% Mathematica 0.01%

dragonwell17's Introduction

Dragonwell Logo

Introduction

Over the years, Java has proliferated at Alibaba. Many applications are written in Java, and our Java developers have written more than one billion lines of Java code.

Alibaba Dragonwell, as a downstream version of OpenJDK, is the OpenJDK implementation at Alibaba, optimized for online e-commerce, financial, and logistics applications running on 100,000+ servers. Alibaba Dragonwell is the engine that runs these distributed Java applications at extreme scale.

Alibaba Dragonwell is a "friendly fork" under the same licensing terms as the upstream OpenJDK project. Alibaba is committed to collaborating closely with the OpenJDK community and intends to bring as many customized features as possible from Alibaba Dragonwell to the upstream.

Using Alibaba Dragonwell

Acknowledgement

Special thanks to those who have made contributions to Alibaba's internal JDK builds.

Publications

Technologies included in Alibaba Dragonwell have been published in the following papers.

dragonwell17's People

Contributors

chrishegarty, cl4es, coleenp, dfuch, erikj79, goelin, hns, iignatev, jddarcy, jesperirl, jonathan-gibbons, lahodaj, magicus, mbaesken, mcimadamore, mrserb, naotoj, pliden, prrace, prsadhuk, rwestrel, seanjmullan, shipilev, stefank, sundararajana, therealmdoerr, tobihartmann, wangweij, xueleifan, zhengyu123


dragonwell17's Issues

The 17.0.7.0.7+7-GA release should report itself as standard, not extended

sudo docker run --pids-limit -1 --network host --privileged -h docker -u $(id -u ${USER}):$(id -g ${USER}) -w $HOME -v /etc/group:/etc/group:ro -v /etc/passwd:/etc/passwd:ro -v /etc/shadow:/etc/shadow:ro --rm -it -v $HOME:$HOME --name jtreg -w $PWD anolis-registry.cn-zhangjiakou.cr.aliyuncs.com/openanolis/dragonwell:17-8.6 bash


[Upstream issue] TestMetaspaceParallelGCAllocationPendingStackTrace.java fails randomly when run with the -Xcomp option on the aarch64 platform

https://tone.aliyun-inc.com/ws/xesljfzh/test_result/147441?tab=2

ACTION: main -- Failed. Execution failed: `main' threw exception: java.lang.AssertionError: No matching stack trace found
REASON: User specified action: run main/othervm -XX:+UseSerialGC -Xlog:gc* -XX:MaxMetaspaceSize=64M jdk.jfr.event.gc.stacktrace.TestMetaspaceSerialGCAllocationPendingStackTrace 
TIME:   38.078 seconds
messages:
command: main -XX:+UseSerialGC -Xlog:gc* -XX:MaxMetaspaceSize=64M jdk.jfr.event.gc.stacktrace.TestMetaspaceSerialGCAllocationPendingStackTrace
reason: User specified action: run main/othervm -XX:+UseSerialGC -Xlog:gc* -XX:MaxMetaspaceSize=64M jdk.jfr.event.gc.stacktrace.TestMetaspaceSerialGCAllocationPendingStackTrace 
Mode: othervm [/othervm specified]
Additional options from @modules: --add-modules jdk.jfr,java.management,jdk.management
elapsed time (seconds): 38.078
configuration:
Boot Layer
  add modules: jdk.jfr java.management jdk.management

STDOUT:
[0.205s][info][gc] Using Serial
[0.207s][info][gc,init] Version: 17.0.6+9 (release)
[0.207s][info][gc,init] CPUs: 64 total, 64 available
[0.207s][info][gc,init] Memory: 246G
[0.207s][info][gc,init] Large Page Support: Disabled
[0.207s][info][gc,init] NUMA Support: Disabled
[0.207s][info][gc,init] Compressed Oops: Enabled (Zero based)
[0.207s][info][gc,init] Heap Min Capacity: 8M
[0.207s][info][gc,init] Heap Initial Capacity: 2G
[0.207s][info][gc,init] Heap Max Capacity: 30718M
[0.207s][info][gc,init] Pre-touch: Disabled
[0.207s][info][gc,metaspace] CDS archive(s) mapped at: [0x0000000800000000-0x0000000800bd5000-0x0000000800bd5000), size 12406784, SharedBaseAddress: 0x0000000800000000, ArchiveRelocationMode: 0.
[0.207s][info][gc,metaspace] Compressed class space mapped at: 0x0000000800c00000-0x0000000804000000, reserved size: 54525952
[0.207s][info][gc,metaspace] Narrow klass base: 0x0000000800000000, Narrow klass shift: 0, Narrow klass range: 0x100000000
[25.998s][info][gc,start    ] GC(0) Pause Full (Metadata GC Threshold)
[25.998s][info][gc,phases,start] GC(0) Phase 1: Mark live objects
[26.020s][info][gc,phases      ] GC(0) Phase 1: Mark live objects 22.387ms
[26.020s][info][gc,phases,start] GC(0) Phase 2: Compute new object addresses
[26.032s][info][gc,phases      ] GC(0) Phase 2: Compute new object addresses 12.029ms
[26.032s][info][gc,phases,start] GC(0) Phase 3: Adjust pointers
[26.037s][info][gc,phases      ] GC(0) Phase 3: Adjust pointers 4.542ms
[26.037s][info][gc,phases,start] GC(0) Phase 4: Move objects
[26.039s][info][gc,phases      ] GC(0) Phase 4: Move objects 2.344ms
[26.049s][info][gc,heap        ] GC(0) DefNew: 134232K(629120K)->0K(629248K) Eden: 134232K(559232K)->0K(559360K) From: 0K(69888K)->0K(69888K)
[26.049s][info][gc,heap        ] GC(0) Tenured: 0K(1398144K)->4081K(1398144K)
[26.049s][info][gc,metaspace   ] GC(0) Metaspace: 15946K(21504K)->9792K(10240K) NonClass: 14770K(18496K)->9208K(9472K) Class: 1176K(3008K)->583K(768K)
[26.049s][info][gc             ] GC(0) Pause Full (Metadata GC Threshold) 131M->3M(1979M) 51.811ms
[26.049s][info][gc,cpu         ] GC(0) User=0.05s Sys=0.00s Real=0.05s
Allocation num: 1150
GC detected: 1
JFR GC events found: 1
Event: jdk.AllocationRequiringGC {
  startTime = 18:55:40.014
  gcId = 0
  size = 808 bytes
  eventThread = "C1 CompilerThread0" (javaThreadId = 19)
}


Attempt: 0 out of 5: no matching stack trace found.
[30.460s][info][gc,start       ] GC(1) Pause Full (Metadata GC Threshold)
[30.460s][info][gc,phases,start] GC(1) Phase 1: Mark live objects
[30.483s][info][gc,phases      ] GC(1) Phase 1: Mark live objects 23.096ms
[30.483s][info][gc,phases,start] GC(1) Phase 2: Compute new object addresses
[30.493s][info][gc,phases      ] GC(1) Phase 2: Compute new object addresses 9.428ms
[30.493s][info][gc,phases,start] GC(1) Phase 3: Adjust pointers
[30.498s][info][gc,phases      ] GC(1) Phase 3: Adjust pointers 4.758ms
[30.498s][info][gc,phases,start] GC(1) Phase 4: Move objects
[30.498s][info][gc,phases      ] GC(1) Phase 4: Move objects 0.731ms
[30.507s][info][gc,heap        ] GC(1) DefNew: 89499K(629248K)->0K(629248K) Eden: 89499K(559360K)->0K(559360K) From: 0K(69888K)->0K(69888K)
[30.507s][info][gc,heap        ] GC(1) Tenured: 4081K(1398144K)->5170K(1398144K)
[30.507s][info][gc,metaspace   ] GC(1) Metaspace: 16523K(21504K)->10992K(11456K) NonClass: 15334K(18688K)->10335K(10560K) Class: 1189K(2816K)->657K(896K)
[30.507s][info][gc             ] GC(1) Pause Full (Metadata GC Threshold) 91M->5M(1979M) 46.724ms
[30.507s][info][gc,cpu         ] GC(1) User=0.05s Sys=0.00s Real=0.05s
Allocation num: 1032
GC detected: 2
JFR GC events found: 1
Event: jdk.AllocationRequiringGC {
  startTime = 18:55:44.476
  gcId = 1
  size = 808 bytes
  eventThread = "C1 CompilerThread0" (javaThreadId = 19)
}


Attempt: 1 out of 5: no matching stack trace found.
[32.982s][info][gc,start       ] GC(2) Pause Full (Metadata GC Threshold)
[32.982s][info][gc,phases,start] GC(2) Phase 1: Mark live objects
[33.001s][info][gc,phases      ] GC(2) Phase 1: Mark live objects 18.602ms
[33.001s][info][gc,phases,start] GC(2) Phase 2: Compute new object addresses
[33.010s][info][gc,phases      ] GC(2) Phase 2: Compute new object addresses 9.217ms
[33.010s][info][gc,phases,start] GC(2) Phase 3: Adjust pointers
[33.015s][info][gc,phases      ] GC(2) Phase 3: Adjust pointers 4.449ms
[33.015s][info][gc,phases,start] GC(2) Phase 4: Move objects
[33.015s][info][gc,phases      ] GC(2) Phase 4: Move objects 0.377ms
[33.024s][info][gc,heap        ] GC(2) DefNew: 67128K(629248K)->0K(629248K) Eden: 67128K(559360K)->0K(559360K) From: 0K(69888K)->0K(69888K)
[33.024s][info][gc,heap        ] GC(2) Tenured: 5170K(1398144K)->6112K(1398144K)
[33.024s][info][gc,metaspace   ] GC(2) Metaspace: 16527K(21504K)->11001K(11456K) NonClass: 15337K(18688K)->10343K(10560K) Class: 1189K(2816K)->658K(896K)
[33.024s][info][gc             ] GC(2) Pause Full (Metadata GC Threshold) 70M->5M(1979M) 41.724ms
[33.024s][info][gc,cpu         ] GC(2) User=0.04s Sys=0.00s Real=0.04s
Allocation num: 1031
GC detected: 3
JFR GC events found: 1
Event: jdk.AllocationRequiringGC {
  startTime = 18:55:46.998
  gcId = 2
  size = 808 bytes
  eventThread = "C1 CompilerThread0" (javaThreadId = 19)
}


Attempt: 2 out of 5: no matching stack trace found.
[35.309s][info][gc,start       ] GC(3) Pause Full (Metadata GC Threshold)
[35.309s][info][gc,phases,start] GC(3) Phase 1: Mark live objects
[35.333s][info][gc,phases      ] GC(3) Phase 1: Mark live objects 24.564ms
[35.333s][info][gc,phases,start] GC(3) Phase 2: Compute new object addresses
[35.342s][info][gc,phases      ] GC(3) Phase 2: Compute new object addresses 9.113ms
[35.342s][info][gc,phases,start] GC(3) Phase 3: Adjust pointers
[35.347s][info][gc,phases      ] GC(3) Phase 3: Adjust pointers 4.841ms
[35.347s][info][gc,phases,start] GC(3) Phase 4: Move objects
[35.349s][info][gc,phases      ] GC(3) Phase 4: Move objects 1.330ms
[35.357s][info][gc,heap        ] GC(3) DefNew: 67126K(629248K)->0K(629248K) Eden: 67126K(559360K)->0K(559360K) From: 0K(69888K)->0K(69888K)
[35.357s][info][gc,heap        ] GC(3) Tenured: 6112K(1398144K)->4103K(1398144K)
[35.357s][info][gc,metaspace   ] GC(3) Metaspace: 16529K(21504K)->11003K(11456K) NonClass: 15339K(18688K)->10345K(10560K) Class: 1189K(2816K)->658K(896K)
[35.358s][info][gc             ] GC(3) Pause Full (Metadata GC Threshold) 71M->4M(1979M) 48.948ms
[35.358s][info][gc,cpu         ] GC(3) User=0.05s Sys=0.00s Real=0.05s
Allocation num: 1031
GC detected: 4
JFR GC events found: 1
Event: jdk.AllocationRequiringGC {
  startTime = 18:55:49.325
  gcId = 3
  size = 808 bytes
  eventThread = "C1 CompilerThread0" (javaThreadId = 19)
}


Attempt: 3 out of 5: no matching stack trace found.
[37.491s][info][gc,start       ] GC(4) Pause Full (Metadata GC Threshold)
[37.491s][info][gc,phases,start] GC(4) Phase 1: Mark live objects
[37.509s][info][gc,phases      ] GC(4) Phase 1: Mark live objects 18.261ms
[37.509s][info][gc,phases,start] GC(4) Phase 2: Compute new object addresses
[37.519s][info][gc,phases      ] GC(4) Phase 2: Compute new object addresses 9.323ms
[37.519s][info][gc,phases,start] GC(4) Phase 3: Adjust pointers
[37.523s][info][gc,phases      ] GC(4) Phase 3: Adjust pointers 4.517ms
[37.523s][info][gc,phases,start] GC(4) Phase 4: Move objects
[37.524s][info][gc,phases      ] GC(4) Phase 4: Move objects 0.411ms
[37.533s][info][gc,heap        ] GC(4) DefNew: 67123K(629248K)->0K(629248K) Eden: 67123K(559360K)->0K(559360K) From: 0K(69888K)->0K(69888K)
[37.533s][info][gc,heap        ] GC(4) Tenured: 4103K(1398144K)->5044K(1398144K)
[37.533s][info][gc,metaspace   ] GC(4) Metaspace: 16529K(21504K)->11004K(11456K) NonClass: 15340K(18688K)->10346K(10560K) Class: 1189K(2816K)->658K(896K)
[37.533s][info][gc             ] GC(4) Pause Full (Metadata GC Threshold) 69M->4M(1979M) 42.011ms
[37.533s][info][gc,cpu         ] GC(4) User=0.05s Sys=0.00s Real=0.04s
Allocation num: 1031
GC detected: 5
JFR GC events found: 1
Event: jdk.AllocationRequiringGC {
  startTime = 18:55:51.507
  gcId = 4
  size = 808 bytes
  eventThread = "C1 CompilerThread0" (javaThreadId = 19)
}


Attempt: 4 out of 5: no matching stack trace found.
[38.060s][info][gc,heap,exit   ] Heap
[38.060s][info][gc,heap,exit   ]  def new generation   total 629248K, used 33561K [0x0000000080200000, 0x00000000aacc0000, 0x0000000300150000)
[38.060s][info][gc,heap,exit   ]   eden space 559360K,   6% used [0x0000000080200000, 0x00000000822c66b8, 0x00000000a2440000)
[38.060s][info][gc,heap,exit   ]   from space 69888K,   0% used [0x00000000a2440000, 0x00000000a2440000, 0x00000000a6880000)
[38.060s][info][gc,heap,exit   ]   to   space 69888K,   0% used [0x00000000a6880000, 0x00000000a6880000, 0x00000000aacc0000)
[38.060s][info][gc,heap,exit   ]  tenured generation   total 1398144K, used 5044K [0x0000000300150000, 0x00000003556b0000, 0x0000000800000000)
[38.060s][info][gc,heap,exit   ]    the space 1398144K,   0% used [0x0000000300150000, 0x000000030063d088, 0x000000030063d200, 0x00000003556b0000)
[38.060s][info][gc,heap,exit   ]  Metaspace       used 11195K, committed 11584K, reserved 77824K
[38.060s][info][gc,heap,exit   ]   class space    used 674K, committed 896K, reserved 53248K
STDERR:
java.lang.AssertionError: No matching stack trace found
	at jdk.jfr.event.gc.stacktrace.AllocationStackTrace.testAllocEvent(AllocationStackTrace.java:350)
	at jdk.jfr.event.gc.stacktrace.AllocationStackTrace.testMetaspaceSerialGCAllocEvent(AllocationStackTrace.java:207)
	at jdk.jfr.event.gc.stacktrace.TestMetaspaceSerialGCAllocationPendingStackTrace.main(TestMetaspaceSerialGCAllocationPendingStackTrace.java:37)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:568)
	at com.sun.javatest.regtest.agent.MainWrapper$MainThread.run(MainWrapper.java:127)
	at java.base/java.lang.Thread.run(Thread.java:833)

Attached failure logs (26): 2.log, 3.log, 4.log, 6.log, 7.log, 9.log, 10.log, 11.log, 13.log, 15.log, 16.log, 20.log, 25.log, 26.log, 27.log, 29.log, 30.log, 31.log, 35.log, 36.log, 37.log, 38.log, 41.log, 44.log, 46.log, 49.log

Reproduction rate: 26/50

[Upstream issue] gtest/AsyncLogGtest.java crashes when run with the -Xcomp option on the aarch64 platform

https://tone.aliyun-inc.com/ws/xesljfzh/test_result/148360?tab=2

Command line: [/tmp/tone/run/jtreg/test-images/hotspot/gtest/server/gtestLauncher -jdk /tmp/tone/run/jtreg/binary-download/dragonwell-17.0.6.0.6+9-GA --gtest_output=xml:test_result.xml --gtest_catch_exceptions=0 --gtest_filter=Log*Test* -Xlog:async]
[2023-02-22T17:50:56.679709251Z] Gathering output for process 3125097
[2023-02-22T17:50:59.723095475Z] Waiting for completion for process 3125097
[2023-02-22T17:50:59.762814954Z] Waiting for completion finished for process 3125097
Output and diagnostic info for process 3125097 was saved into 'pid-3125097-output.log'
Note: Google Test filter = Log*Test*
[==========] Running 39 tests from 4 test cases.
[----------] Global test environment set-up.
[----------] 7 tests from LogTest
[ RUN      ] LogTest.large_message_vm
[       OK ] LogTest.large_message_vm (1 ms)
[ RUN      ] LogTest.enabled_logtarget_vm
[       OK ] LogTest.enabled_logtarget_vm (0 ms)
[ RUN      ] LogTest.disabled_logtarget_vm
[       OK ] LogTest.disabled_logtarget_vm (1 ms)
[ RUN      ] LogTest.enabled_loghandle_vm
[       OK ] LogTest.enabled_loghandle_vm (0 ms)
[ RUN      ] LogTest.disabled_loghandle_vm
[       OK ] LogTest.disabled_loghandle_vm (1 ms)
[ RUN      ] LogTest.enabled_logtargethandle_vm
[       OK ] LogTest.enabled_logtargethandle_vm (0 ms)
[ RUN      ] LogTest.disabled_logtargethandle_vm
[       OK ] LogTest.disabled_logtargethandle_vm (0 ms)
[----------] 7 tests from LogTest (268 ms total)

[----------] 17 tests from LogConfigurationTest
[ RUN      ] LogConfigurationTest.describe_vm
[       OK ] LogConfigurationTest.describe_vm (1 ms)
[ RUN      ] LogConfigurationTest.update_output_vm
[       OK ] LogConfigurationTest.update_output_vm (0 ms)
[ RUN      ] LogConfigurationTest.add_new_output_vm
[       OK ] LogConfigurationTest.add_new_output_vm (1 ms)
[ RUN      ] LogConfigurationTest.disable_logging_vm
[       OK ] LogConfigurationTest.disable_logging_vm (0 ms)
[ RUN      ] LogConfigurationTest.disable_output_vm
[       OK ] LogConfigurationTest.disable_output_vm (0 ms)
[ RUN      ] LogConfigurationTest.reconfigure_decorators_vm
[       OK ] LogConfigurationTest.reconfigure_decorators_vm (1 ms)
[ RUN      ] LogConfigurationTest.reconfigure_decorators_MT_vm
[       OK ] LogConfigurationTest.reconfigure_decorators_MT_vm (1152 ms)
[ RUN      ] LogConfigurationTest.reconfigure_tags_MT_vm
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x0000000000000020, pid=3125097, tid=3125512
#
# JRE version: OpenJDK Runtime Environment (Alibaba Dragonwell Standard Edition)-17.0.6.0.6+9-GA (17.0.6) (build 17.0.6+9)
# Java VM: OpenJDK 64-Bit Server VM (Alibaba Dragonwell Standard Edition)-17.0.6.0.6+9-GA (17.0.6-internal+0-adhoc..jdk-repo, mixed mode, tiered, compressed oops, compressed class ptrs, g1 gc, linux-aarch64)
# Problematic frame:
# C  0x0000000000000020
#
# Core dump will be written. Default location: Core dumps may be processed with "/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h %e" (or dumping to /tmp/tone/run/jtreg/jt-work/hotspot_jtreg/gtest/AsyncLogGtest/core.3125097)
#
# An error report file with more information is saved as:
# /tmp/tone/run/jtreg/jt-work/hotspot_jtreg/gtest/AsyncLogGtest/hs_err_pid3125097.log
#
# If you would like to submit a bug report, please visit:
#   mailto:[email protected]
#

[2023-02-22T17:51:00.120584575Z] Waiting for completion for process 3125097
[2023-02-22T17:51:00.120981324Z] Waiting for completion finished for process 3125097
STDERR:
java.lang.AssertionError: gtest execution failed; exit code = 134.
	at GTestWrapper.main(GTestWrapper.java:98)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:568)
	at com.sun.javatest.regtest.agent.MainWrapper$MainThread.run(MainWrapper.java:127)
	at java.base/java.lang.Thread.run(Thread.java:833)

JavaTest Message: Test threw exception: java.lang.AssertionError: gtest execution failed; exit code = 134.
JavaTest Message: shutting down test

STATUS:Failed.`main' threw exception: java.lang.AssertionError: gtest execution failed; exit code = 134.
rerun:
cd /tmp/tone/run/jtreg/jt-work/hotspot_jtreg/gtest/AsyncLogGtest && \
DISPLAY=:7 \
HOME=/home/testUserForTone \
LANG=C \
LC_CTYPE=C.UTF-8 \
PATH=/bin:/usr/bin:/usr/sbin \
TEST_IMAGE_DIR=/tmp/tone/run/jtreg/test-images \
LD_LIBRARY_PATH=/tmp/tone/run/jtreg/test-images/hotspot/jtreg/native \
CLASSPATH=/tmp/tone/run/jtreg/jt-work/hotspot_jtreg/classes/8/gtest/AsyncLogGtest.d:/tmp/tone/run/jtreg/jdk-repo/test/hotspot/jtreg/gtest:/tmp/tone/run/jtreg/jt-work/hotspot_jtreg/classes/8/test/lib:/tmp/tone/run/jtreg/jdk-repo/test/lib:/tmp/tone/run/jtreg/jtreg/lib/javatest.jar:/tmp/tone/run/jtreg/jtreg/lib/jtreg.jar \
    /tmp/tone/run/jtreg/binary-download/dragonwell-17.0.6.0.6+9-GA/bin/java \
        -Dtest.vm.opts='-Xcomp -ea -esa' \
        -Dtest.tool.vm.opts='-J-Xcomp -J-ea -J-esa' \
        -Dtest.compiler.opts= \
        -Dtest.java.opts= \
        -Dtest.jdk=/tmp/tone/run/jtreg/binary-download/dragonwell-17.0.6.0.6+9-GA \
        -Dcompile.jdk=/tmp/tone/run/jtreg/binary-download/dragonwell-17.0.6.0.6+9-GA \
        -Dtest.timeout.factor=4.0 \
        -Dtest.nativepath=/tmp/tone/run/jtreg/test-images/hotspot/jtreg/native \
        -Dtest.root=/tmp/tone/run/jtreg/jdk-repo/test/hotspot/jtreg \
        -Dtest.name=gtest/AsyncLogGtest.java \
        -Dtest.file=/tmp/tone/run/jtreg/jdk-repo/test/hotspot/jtreg/gtest/AsyncLogGtest.java \
        -Dtest.src=/tmp/tone/run/jtreg/jdk-repo/test/hotspot/jtreg/gtest \
        -Dtest.src.path=/tmp/tone/run/jtreg/jdk-repo/test/hotspot/jtreg/gtest:/tmp/tone/run/jtreg/jdk-repo/test/lib \
        -Dtest.classes=/tmp/tone/run/jtreg/jt-work/hotspot_jtreg/classes/8/gtest/AsyncLogGtest.d \
        -Dtest.class.path=/tmp/tone/run/jtreg/jt-work/hotspot_jtreg/classes/8/gtest/AsyncLogGtest.d:/tmp/tone/run/jtreg/jt-work/hotspot_jtreg/classes/8/test/lib \
        -Dtest.modules='java.base/jdk.internal.misc java.xml' \
        --add-modules java.base,java.xml \
        --add-exports java.base/jdk.internal.misc=ALL-UNNAMED \
        -Xcomp \
        -ea \
        -esa \
        -Djava.library.path=/tmp/tone/run/jtreg/test-images/hotspot/jtreg/native \
        com.sun.javatest.regtest.agent.MainWrapper /tmp/tone/run/jtreg/jt-work/hotspot_jtreg/gtest/AsyncLogGtest.d/main.2.jta --gtest_filter=Log*Test* -Xlog:async

TEST RESULT: Failed. Execution failed: `main' threw exception: java.lang.AssertionError: gtest execution failed; exit code = 134.

Wrong assets on JDK 17.0.0 release?

Hi, the assets attached to the release do not look correct.

> curl -L -O https://github.com/alibaba/dragonwell17/releases/download/dragonwell-17.0.0%2B35_jdk-17-ga/Alibaba_Dragonwell_17.0.0.35_x64_linux.tar.gz
> tar zxf Alibaba_Dragonwell_17.0.0.35_x64_linux.tar.gz
> tree -L 3
.
└── jdk-17.0.0+35+35-test-image
    ├── Readme.txt
    ├── build-info.properties
    ├── hotspot
    │   └── jtreg
    ├── jdk
    │   ├── demos
    │   └── jtreg
    └── lib-test
        └── jtreg

8 directories, 2 files

It looks like this download is actually the test image.

This looks like the right binary.

macOS compile error: Build failure on macOS with Xcode 13.0 as vfork is deprecated

macOS 12
Xcode 13.0

/Users/gaoyuntao/app/UFM/dragonwell-17.0.1.0.1-12_jdk-17.0.1-ga/src/hotspot/os/posix/os_posix.cpp:1888:26: error: 'vfork' is deprecated: Use posix_spawn or fork [-Werror,-Wdeprecated-declarations]
pid = prefer_vfork ? ::vfork() : ::fork();
^
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX12.0.sdk/usr/include/unistd.h:604:1: note: 'vfork' has been explicitly marked deprecated here
__deprecated_msg("Use posix_spawn or fork")
^
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX12.0.sdk/usr/include/sys/cdefs.h:208:48: note: expanded from macro '__deprecated_msg'
#define __deprecated_msg(_msg) __attribute__((deprecated(_msg)))
^
1 error generated.
make[3]: *** [/Users/gaoyuntao/app/UFM/dragonwell-17.0.1.0.1-12_jdk-17.0.1-ga/build/macosx-x86_64-server-release/hotspot/variant-server/libjvm/objs/os_posix.o] Error 1
make[3]: *** Waiting for unfinished jobs....
make[2]: *** [hotspot-server-libs] Error 2

ERROR: Build failed for target 'images' in configuration 'macosx-x86_64-server-release' (exit code 2)

=== Output from failing command(s) repeated here ===

  • For target hotspot_variant-server_libjvm_objs_os_posix.o:
    /Users/gaoyuntao/app/UFM/dragonwell-17.0.1.0.1-12_jdk-17.0.1-ga/src/hotspot/os/posix/os_posix.cpp:1888:26: error: 'vfork' is deprecated: Use posix_spawn or fork [-Werror,-Wdeprecated-declarations]
    pid = prefer_vfork ? ::vfork() : ::fork();
    ^
    /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX12.0.sdk/usr/include/unistd.h:604:1: note: 'vfork' has been explicitly marked deprecated here
    __deprecated_msg("Use posix_spawn or fork")
    ^
    /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX12.0.sdk/usr/include/sys/cdefs.h:208:48: note: expanded from macro '__deprecated_msg'
    #define __deprecated_msg(_msg) __attribute__((deprecated(_msg)))
    ^
    1 error generated.

  • All command lines available in /Users/gaoyuntao/app/UFM/dragonwell-17.0.1.0.1-12_jdk-17.0.1-ga/build/macosx-x86_64-server-release/make-support/failure-logs.
    === End of repeated output ===

No indication of failed target found.
Hint: Try searching the build log for '] Error'.
Hint: See doc/building.html#troubleshooting for assistance.

This was reported as a JDK 17 bug and has already been fixed in 17.0.2. When will this fix be synced here?

https://bugs.openjdk.java.net/browse/JDK-8274293

[Anolis container image] serviceability/sa/ClhsdbCDSCore.java crashes in Unsafe_PutInt when dragonwell17 runs inside an Anolis container on the aarch64 platform

STDOUT:
Starting ClhsdbCDSCore test
Command line: [/opt/java/openjdk/bin/java -cp /tmp/tone/run/jtreg/jt-work/hotspot_jtreg/classes/12/serviceability/sa/ClhsdbCDSCore.d:/tmp/tone/run/jtreg/jdk-repo/test/hotspot/jtreg/serviceability/sa:/tmp/tone/run/jtreg/jt-work/hotspot_jtreg/classes/12/test/lib:/tmp/tone/run/jtreg/jdk-repo/test/lib:/tmp/tone/run/jtreg/jtreg/lib/javatest.jar:/tmp/tone/run/jtreg/jtreg/lib/jtreg.jar -Xmixed -ea -esa -Xshare:dump -Xlog:cds,cds+hashtables -XX:SharedArchiveFile=ArchiveForClhsdbCDSCore.jsa ]
[2023-02-13T05:29:05.940980196Z] Gathering output for process 748336
[ELAPSED: 1564 ms]
[logging stdout to /tmp/tone/run/jtreg/jt-work/hotspot_jtreg/serviceability/sa/ClhsdbCDSCore/serviceability.sa.ClhsdbCDSCore.java-0000-dump.stdout]
[logging stderr to /tmp/tone/run/jtreg/jt-work/hotspot_jtreg/serviceability/sa/ClhsdbCDSCore/serviceability.sa.ClhsdbCDSCore.java-0000-dump.stderr]
[STDERR]

[2023-02-13T05:29:07.478673284Z] Waiting for completion for process 748336
[2023-02-13T05:29:07.478865718Z] Waiting for completion finished for process 748336
Command line: [/opt/java/openjdk/bin/java -cp /tmp/tone/run/jtreg/jt-work/hotspot_jtreg/classes/12/serviceability/sa/ClhsdbCDSCore.d:/tmp/tone/run/jtreg/jdk-repo/test/hotspot/jtreg/serviceability/sa:/tmp/tone/run/jtreg/jt-work/hotspot_jtreg/classes/12/test/lib:/tmp/tone/run/jtreg/jdk-repo/test/lib:/tmp/tone/run/jtreg/jtreg/lib/javatest.jar:/tmp/tone/run/jtreg/jtreg/lib/jtreg.jar -Xmixed -ea -esa -Xmx512m -XX:+UnlockDiagnosticVMOptions -XX:SharedArchiveFile=ArchiveForClhsdbCDSCore.jsa -XX:+CreateCoredumpOnCrash -Xshare:auto -XX:+ProfileInterpreter --add-exports=java.base/jdk.internal.misc=ALL-UNNAMED CrashApp ]
[2023-02-13T05:29:07.486493022Z] Gathering output for process 767538
[2023-02-13T05:29:07.487771326Z] Waiting for completion for process 767538
[2023-02-13T05:29:07.488139035Z] Waiting for completion finished for process 767538
Output and diagnostic info for process 767538 was saved into 'pid-767538-output.log'
[2023-02-13T05:29:07.492733325Z] Waiting for completion for process 767538
[2023-02-13T05:29:07.492840702Z] Waiting for completion finished for process 767538
Run test with ulimit -c: unlimited
[2023-02-13T05:29:07.494890224Z] Gathering output for process 767717
[2023-02-13T05:29:09.681019263Z] Waiting for completion for process 767717
[2023-02-13T05:29:09.681297315Z] Waiting for completion finished for process 767717
Output and diagnostic info for process 767717 was saved into 'pid-767717-output.log'
crashOutputString = [#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x0000ffffb5048988, pid=767717, tid=767816
#
# JRE version: OpenJDK Runtime Environment (Alibaba Dragonwell Standard Edition)-17.0.6.0.6+9-GA (17.0.6+9) (build 17.0.6+9)
# Java VM: OpenJDK 64-Bit Server VM (Alibaba Dragonwell Standard Edition)-17.0.6.0.6+9-GA (17.0.6+9, mixed mode, sharing, tiered, compressed oops, compressed class ptrs, g1 gc, linux-aarch64)
# Problematic frame:
# V  [libjvm.so+0xda8988]  Unsafe_PutInt+0x118
#
# Core dump will be written. Default location: Core dumps may be processed with "/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h %e" (or dumping to /tmp/tone/run/jtreg/jt-work/hotspot_jtreg/serviceability/sa/ClhsdbCDSCore/core.767717)
#
# An error report file with more information is saved as:
# /tmp/tone/run/jtreg/jt-work/hotspot_jtreg/serviceability/sa/ClhsdbCDSCore/hs_err_pid767717.log
#
# If you would like to submit a bug report, please visit:
#   mailto:[email protected]
#
]
getCoreFileLocation found stringWithLocation = Core dumps may be processed with "/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h %e" (or dumping to /tmp/tone/run/jtreg/jt-work/hotspot_jtreg/serviceability/sa/ClhsdbCDSCore/core.767717)
|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h %e
Running systemd-coredump: trying coredumpctl command
[2023-02-13T05:29:14.696980009Z] Gathering output for process 831150
[2023-02-13T05:29:14.714945699Z] Waiting for completion for process 831150
[2023-02-13T05:29:14.715286410Z] Waiting for completion finished for process 831150
Output and diagnostic info for process 831150 was saved into 'pid-831150-output.log'

[Upstream issue] jfr/event/gc/stacktrace/TestG1OldAllocationPendingStackTrace.java randomly times out with high probability

https://tone.aliyun-inc.com/ws/xesljfzh/test_result/156791

Threads class SMR info:
_java_thread_list=0x0000ffff50001df0, length=14, elements={
0x0000ffff8c026ea0, 0x0000ffff8c18f060, 0x0000ffff8c1904d0, 0x0000ffff8c196000,
0x0000ffff8c1973c0, 0x0000ffff8c1987f0, 0x0000ffff8c199df0, 0x0000ffff8c19b270,
0x0000ffff8c1a74f0, 0x0000ffff8c1adad0, 0x0000ffff8c1f4400, 0x0000ffff30091200,
0x0000ffff301035f0, 0x0000ffff50000e10
}

"main" #1 prio=5 os_prio=0 cpu=74.28ms elapsed=480.21s tid=0x0000ffff8c026ea0 nid=0x12eefc in Object.wait()  [0x0000ffff9239e000]
   java.lang.Thread.State: WAITING (on object monitor)
	at java.lang.Object.wait(java.base@17.0.6/Native Method)
	- waiting on <no object reference available>
	at java.lang.Thread.join(java.base@17.0.6/Thread.java:1304)
	- locked <0x00000000f8096868> (a java.lang.Thread)
	at java.lang.Thread.join(java.base@17.0.6/Thread.java:1372)
	at com.sun.javatest.regtest.agent.MainWrapper.main(MainWrapper.java:74)

"Reference Handler" #2 daemon prio=10 os_prio=0 cpu=0.79ms elapsed=480.20s tid=0x0000ffff8c18f060 nid=0x12ef03 waiting on condition  [0x0000ffff6d9fc000]
   java.lang.Thread.State: RUNNABLE
	at java.lang.ref.Reference.waitForReferencePendingList(java.base@17.0.6/Native Method)
	at java.lang.ref.Reference.processPendingReferences(java.base@17.0.6/Reference.java:253)
	at java.lang.ref.Reference$ReferenceHandler.run(java.base@17.0.6/Reference.java:215)

"Finalizer" #3 daemon prio=8 os_prio=0 cpu=0.20ms elapsed=480.20s tid=0x0000ffff8c1904d0 nid=0x12ef04 in Object.wait()  [0x0000ffff6d7fc000]
   java.lang.Thread.State: WAITING (on object monitor)
	at java.lang.Object.wait(java.base@17.0.6/Native Method)
	- waiting on <0x00000000f809bd18> (a java.lang.ref.ReferenceQueue$Lock)
	at java.lang.ref.ReferenceQueue.remove(java.base@17.0.6/ReferenceQueue.java:155)
	- locked <0x00000000f809bd18> (a java.lang.ref.ReferenceQueue$Lock)
	at java.lang.ref.ReferenceQueue.remove(java.base@17.0.6/ReferenceQueue.java:176)
	at java.lang.ref.Finalizer$FinalizerThread.run(java.base@17.0.6/Finalizer.java:172)

"Signal Dispatcher" #4 daemon prio=9 os_prio=0 cpu=0.23ms elapsed=480.20s tid=0x0000ffff8c196000 nid=0x12ef05 waiting on condition  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Service Thread" #5 daemon prio=9 os_prio=0 cpu=2.05ms elapsed=480.20s tid=0x0000ffff8c1973c0 nid=0x12ef06 runnable  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Monitor Deflation Thread" #6 daemon prio=9 os_prio=0 cpu=4.93ms elapsed=480.20s tid=0x0000ffff8c1987f0 nid=0x12ef07 runnable  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"C1 CompilerThread0" #7 daemon prio=9 os_prio=0 cpu=2060.89ms elapsed=480.20s tid=0x0000ffff8c199df0 nid=0x12ef08 waiting on condition  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE
   No compile task

"Sweeper thread" #25 daemon prio=9 os_prio=0 cpu=92.23ms elapsed=480.20s tid=0x0000ffff8c19b270 nid=0x12ef09 runnable  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Common-Cleaner" #26 daemon prio=8 os_prio=0 cpu=0.74ms elapsed=479.85s tid=0x0000ffff8c1a74f0 nid=0x12ef0a in Object.wait()  [0x0000ffff6cbfc000]
   java.lang.Thread.State: TIMED_WAITING (on object monitor)
	at java.lang.Object.wait([email protected]/Native Method)
	- waiting on <no object reference available>
	at java.lang.ref.ReferenceQueue.remove([email protected]/ReferenceQueue.java:155)
	- locked <0x00000000f809bed0> (a java.lang.ref.ReferenceQueue$Lock)
	at jdk.internal.ref.CleanerImpl.run([email protected]/CleanerImpl.java:140)
	at java.lang.Thread.run([email protected]/Thread.java:833)
	at jdk.internal.misc.InnocuousThread.run([email protected]/InnocuousThread.java:162)

"Notification Thread" #27 daemon prio=9 os_prio=0 cpu=0.05ms elapsed=479.76s tid=0x0000ffff8c1adad0 nid=0x12ef0b runnable  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"MainThread" #29 prio=5 os_prio=0 cpu=274604.26ms elapsed=479.54s tid=0x0000ffff8c1f4400 nid=0x12ef0d runnable  [0x0000ffff6c5f9000]
   java.lang.Thread.State: RUNNABLE
	at jdk.jfr.event.gc.stacktrace.OldGenMemoryAllocator.allocate(AllocationStackTrace.java:87)
	at jdk.jfr.event.gc.stacktrace.AllocationStackTrace.allocAndCheck(AllocationStackTrace.java:377)
	at jdk.jfr.event.gc.stacktrace.AllocationStackTrace.testAllocEvent(AllocationStackTrace.java:343)
	at jdk.jfr.event.gc.stacktrace.AllocationStackTrace.testG1OldAllocEvent(AllocationStackTrace.java:283)
	at jdk.jfr.event.gc.stacktrace.TestG1OldAllocationPendingStackTrace.main(TestG1OldAllocationPendingStackTrace.java:37)
	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0([email protected]/Native Method)
	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke([email protected]/NativeMethodAccessorImpl.java:77)
	at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke([email protected]/DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke([email protected]/Method.java:568)
	at com.sun.javatest.regtest.agent.MainWrapper$MainThread.run(MainWrapper.java:127)
	at java.lang.Thread.run([email protected]/Thread.java:833)

"JFR Recorder Thread" #31 daemon prio=5 os_prio=0 cpu=28.51ms elapsed=478.59s tid=0x0000ffff30091200 nid=0x12ef17 waiting on condition  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"JFR Periodic Tasks" #32 daemon prio=5 os_prio=0 cpu=17.72ms elapsed=478.01s tid=0x0000ffff301035f0 nid=0x12ef1b in Object.wait()  [0x0000ffff36d63000]
   java.lang.Thread.State: TIMED_WAITING (on object monitor)
	at java.lang.Object.wait([email protected]/Native Method)
	- waiting on <no object reference available>
	at jdk.jfr.internal.PlatformRecorder.takeNap([email protected]/PlatformRecorder.java:527)
	- locked <0x00000000f8270b58> (a java.lang.Object)
	at jdk.jfr.internal.PlatformRecorder.periodicTask([email protected]/PlatformRecorder.java:508)
	at jdk.jfr.internal.PlatformRecorder.lambda$startDiskMonitor$1([email protected]/PlatformRecorder.java:448)
	at jdk.jfr.internal.PlatformRecorder$$Lambda$103/0x0000000800c33d48.run([email protected]/Unknown Source)
	at java.lang.Thread.run([email protected]/Thread.java:833)

testcase-rerun.log
testcase-rerun-5.log
testcase-rerun-4.log
testcase-rerun-3.log
testcase-rerun-2.log
testcase-rerun-1.log
jtreg-result.log
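The thread dump above (as produced by `jstack <pid>` or `jcmd <pid> Thread.print`) shows each thread's name, ID, and state. The same information is available in-process through the standard `ThreadMXBean` API; a minimal sketch (the class name `DumpThreads` is illustrative):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

public class DumpThreads {
    public static void main(String[] args) {
        ThreadMXBean bean = ManagementFactory.getThreadMXBean();
        // false, false: skip lock-ownership and synchronizer details;
        // the name and state are enough to spot blocked or waiting threads.
        for (ThreadInfo info : bean.dumpAllThreads(false, false)) {
            System.out.printf("\"%s\" id=%d state=%s%n",
                    info.getThreadName(), info.getThreadId(), info.getThreadState());
        }
    }
}
```

Run inside a hung test, this would list, for example, a thread parked in `Thread.join` just as the dump above does.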

Is Dragonwell now a fork of AdoptOpenJDK?

README.md from this repository says:

Alibaba Dragonwell is clearly a "friendly fork" under the same licensing terms as the upstream OpenJDK project.

But I see tags that only exist in the adoptopenjdk repo, see https://github.com/alibaba/dragonwell17/releases/tag/jdk-17.0.1%2B12_adopt

So

  • Is it official that Dragonwell is now a direct fork of the AdoptOpenJDK project?
  • Could you give an idea of what other AdoptOpenJDK material was included in Dragonwell — perhaps not only at the source-code level, but also binary artifacts?

Thanks

[Upstream issue] Running the test java/foreign/TestUpcall.java with -Xcomp -XX:TieredStopAtLevel=1 crashes randomly

https://tone.aliyun-inc.com/ws/xesljfzh/test_result/147439?tab=1

[10.626s][warning][codecache] CodeCache is full. Compiler has been disabled.
[10.627s][warning][codecache] Try increasing the code cache size using -XX:ReservedCodeCacheSize=
CodeCache: size=49152Kb used=49150Kb max_used=49150Kb free=1Kb
 bounds [0x00007f9a0947d000, 0x00007f9a0c47d000, 0x00007f9a0c47d000]
 total_blobs=28260 nmethods=24069 adapters=4117
 compilation: disabled (not enough contiguous free space left)
              stopped_count=1, restarted_count=0
 full_count=0
#
# There is insufficient memory for the Java Runtime Environment to continue.
# CodeCache: no room for upcall_stub_linkToNative
# An error report file with more information is saved as:
# /tmp/tone/run/jtreg/jt-work/test_jdk/java/foreign/TestUpcall/hs_err_pid2529223.log
STDERR:
WARNING: Using incubator modules: jdk.incubator.foreign
OpenJDK 64-Bit Server VM warning: CodeCache is full. Compiler has been disabled.
OpenJDK 64-Bit Server VM warning: Try increasing the code cache size using -XX:ReservedCodeCacheSize=
j -nativepath:build/linux-x86_64-server-release/images/test/jdk/jtreg/native -Xcomp -XX:TieredStopAtLevel=1 test/jdk/java/foreign/TestUpcall.java
[6.976s][warning][codecache] CodeCache is full. Compiler has been disabled.
[6.976s][warning][codecache] Try increasing the code cache size using -XX:ReservedCodeCacheSize=
CodeCache: size=49152Kb used=49151Kb max_used=49151Kb free=0Kb
 bounds [0x00007ff69147d000, 0x00007ff69447d000, 0x00007ff69447d000]
 total_blobs=28467 nmethods=24194 adapters=4196
 compilation: disabled (not enough contiguous free space left)
              stopped_count=1, restarted_count=0
 full_count=0
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x00007ff6a423041c, pid=2612224, tid=2612245
#
# JRE version: OpenJDK Runtime Environment (Alibaba Dragonwell Standard Edition)-17.0.6.0.6+9-GA (17.0.6+9) (build 17.0.6+9)
# Java VM: OpenJDK 64-Bit Server VM (Alibaba Dragonwell Standard Edition)-17.0.6.0.6+9-GA (17.0.6+9, compiled mode, emulated-client, sharing, tiered, compressed oops, compressed class ptrs, g1 gc, linux-amd64)
# Problematic frame:
# V  [libjvm.so+0xf3641c]  ProgrammableUpcallHandler::generate_upcall_stub(_jobject*, _jobject*, _jobject*)+0x10ac
#
# Core dump will be written. Default location: Core dumps may be processed with "/usr/share/apport/apport -p%p -s%s -c%c -d%d -P%P -u%u -g%g -- %E" (or dumping to /home/yansendao/git/jdk17u/tmp/scratch/core.2612224)
#
# An error report file with more information is saved as:
# /home/yansendao/git/jdk17u/tmp/scratch/hs_err_pid2612224.log
#
# If you would like to submit a bug report, please visit:
#   mailto:[email protected]
#

hs_err_pid2613115.log
result.log
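Both crash reports above show the 48 MB JIT code cache from the log completely full, and the warning suggests raising it with `-XX:ReservedCodeCacheSize`. Code-heap occupancy can also be watched from Java through the standard memory-pool MXBeans; a minimal sketch (the class name `CodeCacheUsage` is illustrative):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;

public class CodeCacheUsage {
    public static void main(String[] args) {
        // With -XX:+SegmentedCodeCache there are several "CodeHeap ..." pools;
        // with -XX:-SegmentedCodeCache a single "CodeCache" pool.
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            if (pool.getName().contains("Code")) {
                System.out.printf("%s: used=%dKB committed=%dKB max=%dKB%n",
                        pool.getName(),
                        pool.getUsage().getUsed() / 1024,
                        pool.getUsage().getCommitted() / 1024,
                        pool.getUsage().getMax() / 1024);
            }
        }
    }
}
```

The `max` column corresponds to the `size=...Kb` figure in the `CodeCache` summary printed when the cache fills up.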

[Wisp] Port JKU coroutine support to JDK17.

Summary:
The following features are included in this patch:

Test Plan:
test/hotspot/jtreg/runtime/coroutine/MemLeakTest.java
test/hotspot/jtreg/runtime/coroutine/AvoidDeoptCoroutineMethodTest.java

Dragonwell 17 Release plan

Prepare

  • add check.yml
  • configure version number

Platforms

  • linux-x64
  • linux-alpine-x64
  • linux-aarch64
  • windows-x64

Build Jobs

http://ci.dragonwell-jdk.io/job/build-scripts/job/jobs/job/jdk17/

Done. Needs upgrade to VS2019.
http://ci.dragonwell-jdk.io/job/build-scripts/job/jobs/job/jdk17/job/jdk17-windows-x64-dragonwell/

10:55:36  openjdk version "17" 2021-09-14
10:55:36  OpenJDK Runtime Environment (Alibaba Dragonwell)-17.0.0+35 (build 17+35)
10:55:36  OpenJDK 64-Bit Server VM (Alibaba Dragonwell)-17.0.0+35 (build 17+35, mixed mode, sharing)

Test Results

http://ci.dragonwell-jdk.io/job/build-scripts/job/openjdk17-pipeline/

Public Tests

Linux X86
[Jtreg]
http://ci.dragonwell-jdk.io/job/Test_openjdk17_dragonwell_sanity.openjdk_x86-64_linux/3/testReport/ passed
http://ci.dragonwell-jdk.io/job/Test_openjdk17_dragonwell_extended.openjdk_x86-64_linux/2/testReport/ passed
[System]
http://ci.dragonwell-jdk.io/job/Test_openjdk17_dragonwell_sanity.system_x86-64_linux/23/
http://ci.dragonwell-jdk.io/job/Test_openjdk17_dragonwell_extended.system_x86-64_linux/17/

LinuxArm

[Jtreg]
http://ci.dragonwell-jdk.io/job/Test_openjdk17_dragonwell_sanity.openjdk_aarch64_linux/16/testReport/
Some failures, mainly related to the foreign API

[System]
http://ci.dragonwell-jdk.io/job/Test_openjdk17_dragonwell_sanity.system_aarch64_linux/
http://ci.dragonwell-jdk.io/job/Test_openjdk17_dragonwell_extended.system_aarch64_linux/

Windows

http://ci.dragonwell-jdk.io/job/Test_openjdk17_dragonwell_sanity.openjdk_x86-64_windows/2/testReport/ passed

Internal Tests

specjbb2015 passed

Other Platforms

Release Contents

Docker

OSS

Done

Github Artifacts

https://github.com/alibaba/dragonwell17/releases/tag/dragonwell-17.0.0%2B35_jdk-17-ga

  • Check

RPM in Alibaba Linux

[upstream] gtest/LargePageGtests.java#use-large-pages-1G fails randomly

https://tone.aliyun-inc.com/ws/xesljfzh/test_result/152298?tab=2

A
Range [7f7b99144000-7f7b9914a000) contains: 
7f7b99144000-7f7b99145000 rwxp 00000000 00:00 0 
7f7b99145000-7f7b99146000 rw-p 00000000 00:00 0 
7f7b99146000-7f7b99147000 rwxp 00000000 00:00 0 
7f7b99147000-7f7b99148000 rw-p 00000000 00:00 0 
7f7b99148000-7f7b99149000 rwxp 00000000 00:00 0 
7f7b99149000-7f7b9914a000 rw-p 00000000 00:00 0 

B
Range [7f7b99144000-7f7b9914a000) contains: 
7f7b99144000-7f7b99145000 rwxp 00000000 00:00 0 
7f7b99149000-7f7b9914a000 rw-p 00000000 00:00 0 

C
Range [7f7b99144000-7f7b9914a000) contains: 
7f7b99144000-7f7b99145000 rwxp 00000000 00:00 0 
7f7b99145000-7f7b99149000 ---p 00000000 00:00 0 
7f7b99149000-7f7b9914a000 rw-p 00000000 00:00 0 

[       OK ] os.release_multi_mappings_vm (1 ms)
[ RUN      ] os.release_one_mapping_multi_commits_vm
A
Range [7f7b19d7b000-7f7b1ad7b000) contains: 
7f7b19d7b000-7f7b1a17b000 rw-p 00000000 00:00 0 
7f7b1a17b000-7f7b1a57b000 ---p 00000000 00:00 0 
7f7b1a57b000-7f7b1a97b000 rw-p 00000000 00:00 0 
7f7b1a97b000-7f7b1ad7b000 ---p 00000000 00:00 0 

B
Range [7f7b19d7b000-7f7b1ad7b000) contains: 
nothing.

test/hotspot/gtest/runtime/test_os.cpp:528: Failure
Expected equality of these values:
  p2
    Which is: NULL
  p
    Which is: 0x7f7b19d7b000
[  FAILED  ] os.release_one_mapping_multi_commits_vm (1 ms)

The tests java/security/cert/pkix/policyChanges/TestPolicy.java and java/security/cert/CertPathBuilder/targetConstraints/BuildEEBasicConstraints.java fail because their certificates have expired

1. Jenkins link:
http://ci.dragonwell-jdk.io/job/Test_openjdk17_dragonwell_extended.openjdk_x86-64_linux/32/consoleFull
2. Execution machine:
47.242.32.229
3. Execution machine directory:
/home/testuser/jenkins/workspace/Test_openjdk17_dragonwell_extended.openjdk_x86-64_linux
4. Failed tests:
a.
./aqa-tests/openjdk/openjdk-jdk/test/jdk/java/security/cert/pkix/policyChanges/TestPolicy.java
Environment reproduction log:

b.
java/security/cert/CertPathBuilder/targetConstraints/BuildEEBasicConstraints.java
Environment reproduction log:
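Expired test certificates like these have to be regenerated upstream. As an illustration of how the expiry itself can be detected, the sketch below scans a keystore for certificates whose `notAfter` date has passed; it uses the JDK's default truststore as input, not the tests' own certificates (the class name `ExpiredCerts` is illustrative):

```java
import java.io.FileInputStream;
import java.security.KeyStore;
import java.security.cert.X509Certificate;
import java.util.Collections;
import java.util.Date;

public class ExpiredCerts {
    public static void main(String[] args) throws Exception {
        // Location of the default truststore since JDK 9.
        String path = System.getProperty("java.home") + "/lib/security/cacerts";
        KeyStore ks = KeyStore.getInstance(KeyStore.getDefaultType());
        try (FileInputStream in = new FileInputStream(path)) {
            ks.load(in, null); // certificates can be read without the password
        }
        Date now = new Date();
        for (String alias : Collections.list(ks.aliases())) {
            if (ks.isCertificateEntry(alias)) {
                X509Certificate cert = (X509Certificate) ks.getCertificate(alias);
                if (cert.getNotAfter().before(now)) {
                    System.out.println(alias + " expired on " + cert.getNotAfter());
                }
            }
        }
        System.out.println("scan complete");
    }
}
```

The same loop, pointed at the keystores checked into the two tests, would flag the expired entries directly.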

[Upstream issue] The Renaissance sub-benchmark finagle-chirper randomly fails with java.lang.InternalError: java.lang.NoSuchMethodException: no such method: java.lang.invoke.MethodHandle.linkToStatic

https://tone.aliyun-inc.com/ws/xesljfzh/test_result/158865?tab=2

====== finagle-chirper (web) [default], iteration 0 started ======
Resetting master, feed map size: 5000
OpenJDK 64-Bit Server VM warning: Try increasing the code heap size using -XX:NonProfiledCodeHeapSize=
23/03/21 15:23:36 ERROR monitor: VM error
java.lang.InternalError: java.lang.NoSuchMethodException: no such method: java.lang.invoke.MethodHandle.linkToStatic(Object,Object,Object,Object,long,Object,Object,MemberName)Object/invokeStatic
	at java.base/java.lang.invoke.MethodHandleStatics.newInternalError(MethodHandleStatics.java:155)
	at java.base/java.lang.invoke.DirectMethodHandle.makePreparedLambdaForm(DirectMethodHandle.java:266)
	at java.base/java.lang.invoke.DirectMethodHandle.preparedLambdaForm(DirectMethodHandle.java:233)
	at java.base/java.lang.invoke.DirectMethodHandle.preparedLambdaForm(DirectMethodHandle.java:218)
	at java.base/java.lang.invoke.DirectMethodHandle.preparedLambdaForm(DirectMethodHandle.java:227)
	at java.base/java.lang.invoke.DirectMethodHandle.make(DirectMethodHandle.java:108)
	at java.base/java.lang.invoke.MethodHandles$Lookup.getDirectMethodCommon(MethodHandles.java:4004)
	at java.base/java.lang.invoke.MethodHandles$Lookup.getDirectMethodNoSecurityManager(MethodHandles.java:3960)
	at java.base/java.lang.invoke.MethodHandles$Lookup.getDirectMethodForConstant(MethodHandles.java:4204)
	at java.base/java.lang.invoke.MethodHandles$Lookup.linkMethodHandleConstant(MethodHandles.java:4152)
	at java.base/java.lang.invoke.MethodHandleNatives.linkMethodHandleConstant(MethodHandleNatives.java:615)
	at com.twitter.finagle.http.filter.StreamingStatsFilter.updateClosedStream(StreamingStatsFilter.scala:156)
	at com.twitter.finagle.http.filter.StreamingStatsFilter.$anonfun$apply$1(StreamingStatsFilter.scala:142)
	at com.twitter.finagle.http.filter.StreamingStatsFilter.$anonfun$apply$1$adapted(StreamingStatsFilter.scala:131)
	at com.twitter.util.Promise$Monitored.apply(Promise.scala:219)
	at com.twitter.util.Promise$WaitQueue.run(Promise.scala:104)
	at com.twitter.util.Promise$WaitQueue.$anonfun$runInScheduler$1(Promise.scala:97)
	at com.twitter.concurrent.LocalScheduler$Activation.run(Scheduler.scala:167)
	at com.twitter.concurrent.LocalScheduler$Activation.submit(Scheduler.scala:126)
	at com.twitter.concurrent.LocalScheduler.submit(Scheduler.scala:243)
	at com.twitter.concurrent.Scheduler$.submit(Scheduler.scala:78)
	at com.twitter.util.Promise$WaitQueue.runInScheduler(Promise.scala:97)
	at com.twitter.util.Promise.updateIfEmpty(Promise.scala:808)
	at com.twitter.util.Promise.update(Promise.scala:780)
	at com.twitter.util.Promise.setValue(Promise.scala:756)
	at com.twitter.concurrent.AsyncQueue.offer(AsyncQueue.scala:121)
	at com.twitter.finagle.netty4.transport.ChannelTransport$$anon$2.channelRead(ChannelTransport.scala:169)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
	at com.twitter.finagle.netty4.channel.ChannelRequestStatsHandler.channelRead(ChannelRequestStatsHandler.scala:48)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at com.twitter.finagle.netty4.http.handler.UnpoolHttpHandler$.channelRead(UnpoolHttpHandler.scala:32)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at com.twitter.finagle.netty4.http.handler.BadRequestHandler.channelRead(BadRequestHandler.scala:42)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at com.twitter.finagle.netty4.http.handler.HeaderValidatorHandler$.channelRead(HeaderValidatorHandler.scala:51)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at com.twitter.finagle.netty4.http.handler.UriValidatorHandler$.channelRead(UriValidatorHandler.scala:30)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
	at io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
	at com.twitter.finagle.http2.transport.common.Http2StreamMessageHandler.channelRead(Http2StreamMessageHandler.scala:76)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
	at com.twitter.finagle.http2.transport.common.StripHeadersHandler$.channelRead(StripHeadersHandler.scala:24)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
	at io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at com.twitter.finagle.http2.transport.server.H2UriValidatorHandler$.channelRead(H2UriValidatorHandler.scala:36)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.handler.codec.http2.AbstractHttp2StreamChannel$Http2ChannelUnsafe.doRead0(AbstractHttp2StreamChannel.java:901)
	at io.netty.handler.codec.http2.AbstractHttp2StreamChannel.fireChildRead(AbstractHttp2StreamChannel.java:555)
	at io.netty.handler.codec.http2.Http2MultiplexHandler.channelRead(Http2MultiplexHandler.java:180)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.handler.codec.http2.Http2FrameCodec.onHttp2Frame(Http2FrameCodec.java:707)
	at io.netty.handler.codec.http2.Http2FrameCodec$FrameListener.onHeadersRead(Http2FrameCodec.java:639)
	at io.netty.handler.codec.http2.InboundHttpToHttp2Adapter.handle(InboundHttpToHttp2Adapter.java:67)
	at io.netty.handler.codec.http2.Http2FrameCodec.userEventTriggered(Http2FrameCodec.java:282)
	at io.netty.channel.AbstractChannelHandlerContext.invokeUserEventTriggered(AbstractChannelHandlerContext.java:346)
	at io.netty.channel.AbstractChannelHandlerContext.invokeUserEventTriggered(AbstractChannelHandlerContext.java:332)
	at io.netty.channel.AbstractChannelHandlerContext.fireUserEventTriggered(AbstractChannelHandlerContext.java:324)
	at io.netty.handler.codec.http.HttpServerUpgradeHandler.upgrade(HttpServerUpgradeHandler.java:376)
	at io.netty.handler.codec.http.HttpServerUpgradeHandler.decode(HttpServerUpgradeHandler.java:266)
	at io.netty.handler.codec.http.HttpServerUpgradeHandler.decode(HttpServerUpgradeHandler.java:40)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:88)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:436)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:324)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:296)
	at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:251)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at com.twitter.finagle.http2.transport.server.PriorKnowledgeHandler.channelRead(PriorKnowledgeHandler.scala:71)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
	at com.twitter.finagle.netty4.channel.ChannelStatsHandler.channelRead(ChannelStatsHandler.scala:156)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:795)
	at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:480)
	at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:378)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at com.twitter.finagle.util.BlockingTimeTrackingThreadFactory$$anon$1.run(BlockingTimeTrackingThreadFactory.scala:23)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: java.lang.NoSuchMethodException: no such method: java.lang.invoke.MethodHandle.linkToStatic(Object,Object,Object,Object,long,Object,Object,MemberName)Object/invokeStatic
	at java.base/java.lang.invoke.MemberName.makeAccessException(MemberName.java:976)
	at java.base/java.lang.invoke.MemberName$Factory.resolveOrFail(MemberName.java:1117)
	at java.base/java.lang.invoke.DirectMethodHandle.makePreparedLambdaForm(DirectMethodHandle.java:263)
	... 135 more
Caused by: java.lang.NoSuchMethodError: 'java.lang.Object java.lang.invoke.MethodHandle.linkToStatic(java.lang.Object, java.lang.Object, java.lang.Object, java.lang.Object, long, java.lang.Object, java.lang.Object, java.lang.invoke.MemberName)'
	at java.base/java.lang.invoke.MethodHandleNatives.resolve(Native Method)
	at java.base/java.lang.invoke.MemberName$Factory.resolve(MemberName.java:1085)
	at java.base/java.lang.invoke.MemberName$Factory.resolveOrFail(MemberName.java:1114)
	... 136 more
Caused by: java.lang.VirtualMachineError: Out of space in CodeCache for method handle intrinsic
	... 139 more
VM error: java.lang.NoSuchMethodException: no such method: java.lang.invoke.MethodHandle.linkToStatic(Object,Object,Object,Object,long,Object,Object,MemberName)Object/invokeStatic
java.lang.InternalError: java.lang.NoSuchMethodException: no such method: java.lang.invoke.MethodHandle.linkToStatic(Object,Object,Object,Object,long,Object,Object,MemberName)Object/invokeStatic
	at java.base/java.lang.invoke.MethodHandleStatics.newInternalError(MethodHandleStatics.java:155)
	at java.base/java.lang.invoke.DirectMethodHandle.makePreparedLambdaForm(DirectMethodHandle.java:266)
	at java.base/java.lang.invoke.DirectMethodHandle.preparedLambdaForm(DirectMethodHandle.java:233)
	at java.base/java.lang.invoke.DirectMethodHandle.preparedLambdaForm(DirectMethodHandle.java:218)
	at java.base/java.lang.invoke.DirectMethodHandle.preparedLambdaForm(DirectMethodHandle.java:227)
	at java.base/java.lang.invoke.DirectMethodHandle.make(DirectMethodHandle.java:108)
	at java.base/java.lang.invoke.MethodHandles$Lookup.getDirectMethodCommon(MethodHandles.java:4004)
	at java.base/java.lang.invoke.MethodHandles$Lookup.getDirectMethodNoSecurityManager(MethodHandles.java:3960)
	at java.base/java.lang.invoke.MethodHandles$Lookup.getDirectMethodForConstant(MethodHandles.java:4204)
	at java.base/java.lang.invoke.MethodHandles$Lookup.linkMethodHandleConstant(MethodHandles.java:4152)
	at java.base/java.lang.invoke.MethodHandleNatives.linkMethodHandleConstant(MethodHandleNatives.java:615)
	at com.twitter.finagle.http.filter.StreamingStatsFilter.updateClosedStream(StreamingStatsFilter.scala:156)
	at com.twitter.finagle.http.filter.StreamingStatsFilter.$anonfun$apply$1(StreamingStatsFilter.scala:142)
	at com.twitter.finagle.http.filter.StreamingStatsFilter.$anonfun$apply$1$adapted(StreamingStatsFilter.scala:131)
	at com.twitter.util.Promise$Monitored.apply(Promise.scala:219)
	at com.twitter.util.Promise$WaitQueue.run(Promise.scala:104)
	at com.twitter.util.Promise$WaitQueue.$anonfun$runInScheduler$1(Promise.scala:97)
	at com.twitter.concurrent.LocalScheduler$Activation.run(Scheduler.scala:167)
	at com.twitter.concurrent.LocalScheduler$Activation.submit(Scheduler.scala:126)
	at com.twitter.concurrent.LocalScheduler.submit(Scheduler.scala:243)
	at com.twitter.concurrent.Scheduler$.submit(Scheduler.scala:78)
	at com.twitter.util.Promise$WaitQueue.runInScheduler(Promise.scala:97)
	at com.twitter.util.Promise.updateIfEmpty(Promise.scala:808)
	at com.twitter.util.Promise.update(Promise.scala:780)
	at com.twitter.util.Promise.setValue(Promise.scala:756)
	at com.twitter.concurrent.AsyncQueue.offer(AsyncQueue.scala:121)
	at com.twitter.finagle.netty4.transport.ChannelTransport$$anon$2.channelRead(ChannelTransport.scala:169)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
	at com.twitter.finagle.netty4.channel.ChannelRequestStatsHandler.channelRead(ChannelRequestStatsHandler.scala:48)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at com.twitter.finagle.netty4.http.handler.UnpoolHttpHandler$.channelRead(UnpoolHttpHandler.scala:32)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at com.twitter.finagle.netty4.http.handler.BadRequestHandler.channelRead(BadRequestHandler.scala:42)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at com.twitter.finagle.netty4.http.handler.HeaderValidatorHandler$.channelRead(HeaderValidatorHandler.scala:51)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at com.twitter.finagle.netty4.http.handler.UriValidatorHandler$.channelRead(UriValidatorHandler.scala:30)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
	at io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
	at com.twitter.finagle.http2.transport.common.Http2StreamMessageHandler.channelRead(Http2StreamMessageHandler.scala:76)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
	at com.twitter.finagle.http2.transport.common.StripHeadersHandler$.channelRead(StripHeadersHandler.scala:24)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
	at io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at com.twitter.finagle.http2.transport.server.H2UriValidatorHandler$.channelRead(H2UriValidatorHandler.scala:36)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.handler.codec.http2.AbstractHttp2StreamChannel$Http2ChannelUnsafe.doRead0(AbstractHttp2StreamChannel.java:901)
	at io.netty.handler.codec.http2.AbstractHttp2StreamChannel.fireChildRead(AbstractHttp2StreamChannel.java:555)
	at io.netty.handler.codec.http2.Http2MultiplexHandler.channelRead(Http2MultiplexHandler.java:180)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.handler.codec.http2.Http2FrameCodec.onHttp2Frame(Http2FrameCodec.java:707)
	at io.netty.handler.codec.http2.Http2FrameCodec$FrameListener.onHeadersRead(Http2FrameCodec.java:639)
	at io.netty.handler.codec.http2.InboundHttpToHttp2Adapter.handle(InboundHttpToHttp2Adapter.java:67)
	at io.netty.handler.codec.http2.Http2FrameCodec.userEventTriggered(Http2FrameCodec.java:282)
	at io.netty.channel.AbstractChannelHandlerContext.invokeUserEventTriggered(AbstractChannelHandlerContext.java:346)
	at io.netty.channel.AbstractChannelHandlerContext.invokeUserEventTriggered(AbstractChannelHandlerContext.java:332)
	at io.netty.channel.AbstractChannelHandlerContext.fireUserEventTriggered(AbstractChannelHandlerContext.java:324)
	at io.netty.handler.codec.http.HttpServerUpgradeHandler.upgrade(HttpServerUpgradeHandler.java:376)
	at io.netty.handler.codec.http.HttpServerUpgradeHandler.decode(HttpServerUpgradeHandler.java:266)
	at io.netty.handler.codec.http.HttpServerUpgradeHandler.decode(HttpServerUpgradeHandler.java:40)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:88)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:436)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:324)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:296)
	at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:251)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at com.twitter.finagle.http2.transport.server.PriorKnowledgeHandler.channelRead(PriorKnowledgeHandler.scala:71)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
	at com.twitter.finagle.netty4.channel.ChannelStatsHandler.channelRead(ChannelStatsHandler.scala:156)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:795)
	at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:480)
	at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:378)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at com.twitter.finagle.util.BlockingTimeTrackingThreadFactory$$anon$1.run(BlockingTimeTrackingThreadFactory.scala:23)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: java.lang.NoSuchMethodException: no such method: java.lang.invoke.MethodHandle.linkToStatic(Object,Object,Object,Object,long,Object,Object,MemberName)Object/invokeStatic
	at java.base/java.lang.invoke.MemberName.makeAccessException(MemberName.java:976)
	at java.base/java.lang.invoke.MemberName$Factory.resolveOrFail(MemberName.java:1117)
	at java.base/java.lang.invoke.DirectMethodHandle.makePreparedLambdaForm(DirectMethodHandle.java:263)
	... 135 more
Caused by: java.lang.NoSuchMethodError: 'java.lang.Object java.lang.invoke.MethodHandle.linkToStatic(java.lang.Object, java.lang.Object, java.lang.Object, java.lang.Object, long, java.lang.Object, java.lang.Object, java.lang.invoke.MemberName)'
	at java.base/java.lang.invoke.MethodHandleNatives.resolve(Native Method)
	at java.base/java.lang.invoke.MemberName$Factory.resolve(MemberName.java:1085)
	at java.base/java.lang.invoke.MemberName$Factory.resolveOrFail(MemberName.java:1114)
	... 136 more
Caused by: java.lang.VirtualMachineError: Out of space in CodeCache for method handle intrinsic
	... 139 more
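The VirtualMachineError at the bottom of this chain shows the code cache being exhausted by method handle intrinsics, which then surfaces as the misleading NoSuchMethodError above it. A common mitigation (a general JVM tuning suggestion, not something stated in this report) is to reserve a larger code cache and monitor its usage:

```
-XX:ReservedCodeCacheSize=512m   # enlarge the code cache (the JDK 17 default is 240 MB)
-XX:+PrintCodeCache              # print code cache usage on VM exit
```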

sun/security/provider/SecureRandom/AbstractDrbg/SpecTest.java intermittently times out

Describe the bug
sun/security/provider/SecureRandom/AbstractDrbg/SpecTest.java fails intermittently. When /dev/random runs out of entropy, the test case times out.

To Reproduce

  1. Make /proc/sys/kernel/random/entropy_avail smaller than 1000:
wc -l /dev/random
cat /proc/sys/kernel/random/entropy_avail


  2. Run the test:
rm -rf tmp/ ; jtreg -nr -v:fail,error -w tmp -timeout:0.1 test/jdk/sun/security/provider/SecureRandom/AbstractDrbg/SpecTest.java

Result:

ACTION: main -- Error. Program `/home/yansendao/software/jdk/temurin/jdk-11.0.17+8/bin/java' timed out (timeout set to 12000ms, elapsed time including timeout handling was 12400ms).
REASON: User specified action: run main SpecTest 
TIME:   12.403 seconds
messages:
command: main SpecTest
reason: User specified action: run main SpecTest 
Mode: othervm [test or library overrides a system module]
Additional options from @modules: --add-modules java.base --add-exports java.base/sun.security.provider=ALL-UNNAMED
Timeout information:
Running jstack on process 807522
2022-11-03 18:08:10
Full thread dump OpenJDK 64-Bit Server VM (11.0.17+8 mixed mode):

Threads class SMR info:
_java_thread_list=0x0000fffef4001ec0, length=12, elements={
0x0000ffff84028800, 0x0000ffff844ef800, 0x0000ffff844f3800, 0x0000ffff84508800,
0x0000ffff8450a800, 0x0000ffff8450d000, 0x0000ffff8450f000, 0x0000ffff84511000,
0x0000ffff8454e800, 0x0000fffef006d000, 0x0000ffff8463a000, 0x0000fffef4001000
}

"main" #1 prio=5 os_prio=0 cpu=141.21ms elapsed=12.37s tid=0x0000ffff84028800 nid=0xc5264 in Object.wait()  [0x0000ffff8b5af000]
   java.lang.Thread.State: WAITING (on object monitor)
        at java.lang.Object.wait([email protected]/Native Method)
        - waiting on <0x0000000101a0d398> (a java.lang.Thread)
        at java.lang.Thread.join([email protected]/Thread.java:1300)
        - waiting to re-lock in wait() <0x0000000101a0d398> (a java.lang.Thread)
        at java.lang.Thread.join([email protected]/Thread.java:1375)
        at com.sun.javatest.regtest.agent.MainWrapper.main(MainWrapper.java:74)

"Reference Handler" #2 daemon prio=10 os_prio=0 cpu=0.18ms elapsed=12.35s tid=0x0000ffff844ef800 nid=0xc526c waiting on condition  [0x0000ffff12ef5000]
   java.lang.Thread.State: RUNNABLE
        at java.lang.ref.Reference.waitForReferencePendingList([email protected]/Native Method)
        at java.lang.ref.Reference.processPendingReferences([email protected]/Reference.java:241)
        at java.lang.ref.Reference$ReferenceHandler.run([email protected]/Reference.java:213)

"Finalizer" #3 daemon prio=8 os_prio=0 cpu=0.48ms elapsed=12.35s tid=0x0000ffff844f3800 nid=0xc526d in Object.wait()  [0x0000ffff12cf5000]
   java.lang.Thread.State: WAITING (on object monitor)
        at java.lang.Object.wait([email protected]/Native Method)
        - waiting on <0x0000000101c09008> (a java.lang.ref.ReferenceQueue$Lock)
        at java.lang.ref.ReferenceQueue.remove([email protected]/ReferenceQueue.java:155)
        - waiting to re-lock in wait() <0x0000000101c09008> (a java.lang.ref.ReferenceQueue$Lock)
        at java.lang.ref.ReferenceQueue.remove([email protected]/ReferenceQueue.java:176)
        at java.lang.ref.Finalizer$FinalizerThread.run([email protected]/Finalizer.java:170)

"Signal Dispatcher" #4 daemon prio=9 os_prio=0 cpu=0.33ms elapsed=12.34s tid=0x0000ffff84508800 nid=0xc526e runnable  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Service Thread" #5 daemon prio=9 os_prio=0 cpu=0.09ms elapsed=12.34s tid=0x0000ffff8450a800 nid=0xc526f runnable  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"C2 CompilerThread0" #6 daemon prio=9 os_prio=0 cpu=45.74ms elapsed=12.34s tid=0x0000ffff8450d000 nid=0xc5270 waiting on condition  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE
   No compile task

"C1 CompilerThread0" #18 daemon prio=9 os_prio=0 cpu=49.63ms elapsed=12.34s tid=0x0000ffff8450f000 nid=0xc5271 waiting on condition  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE
   No compile task

"Sweeper thread" #24 daemon prio=9 os_prio=0 cpu=0.11ms elapsed=12.34s tid=0x0000ffff84511000 nid=0xc5272 runnable  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Common-Cleaner" #25 daemon prio=8 os_prio=0 cpu=0.16ms elapsed=12.32s tid=0x0000ffff8454e800 nid=0xc5273 in Object.wait()  [0x0000ffff11b85000]
   java.lang.Thread.State: TIMED_WAITING (on object monitor)
        at java.lang.Object.wait([email protected]/Native Method)
        - waiting on <0x0000000101c449d8> (a java.lang.ref.ReferenceQueue$Lock)
        at java.lang.ref.ReferenceQueue.remove([email protected]/ReferenceQueue.java:155)
        - waiting to re-lock in wait() <0x0000000101c449d8> (a java.lang.ref.ReferenceQueue$Lock)
        at jdk.internal.ref.CleanerImpl.run([email protected]/CleanerImpl.java:148)
        at java.lang.Thread.run([email protected]/Thread.java:829)
        at jdk.internal.misc.InnocuousThread.run([email protected]/InnocuousThread.java:161)

"C1 CompilerThread1" #19 daemon prio=9 os_prio=0 cpu=35.85ms elapsed=12.29s tid=0x0000fffef006d000 nid=0xc5274 waiting on condition  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE
   No compile task

"MainThread" #27 prio=5 os_prio=0 cpu=44.95ms elapsed=12.23s tid=0x0000ffff8463a000 nid=0xc5276 runnable  [0x0000ffff11582000]
   java.lang.Thread.State: RUNNABLE
        at java.io.FileInputStream.readBytes([email protected]/Native Method)
        at java.io.FileInputStream.read([email protected]/FileInputStream.java:279)
        at java.io.FilterInputStream.read([email protected]/FilterInputStream.java:133)
        at sun.security.provider.SeedGenerator$URLSeedGenerator.getSeedBytes([email protected]/SeedGenerator.java:541)
        at sun.security.provider.SeedGenerator.generateSeed([email protected]/SeedGenerator.java:144)
        at sun.security.provider.AbstractDrbg.lambda$static$0([email protected]/AbstractDrbg.java:524)
        at sun.security.provider.AbstractDrbg$$Lambda$41/0x000000080007b040.getEntropy([email protected]/Unknown Source)
        at sun.security.provider.AbstractDrbg.getEntropyInput([email protected]/AbstractDrbg.java:507)
        at sun.security.provider.AbstractDrbg.getEntropyInput([email protected]/AbstractDrbg.java:494)
        at sun.security.provider.AbstractDrbg.instantiateIfNecessary([email protected]/AbstractDrbg.java:696)
        - locked <0x00000001014de8f8> (a sun.security.provider.S$Impl3)
        at sun.security.provider.AbstractDrbg.engineNextBytes([email protected]/AbstractDrbg.java:378)
        at sun.security.provider.S.engineNextBytes([email protected]/S.java:138)
        at java.security.SecureRandom.nextBytes([email protected]/SecureRandom.java:782)
        - locked <0x00000001014de968> (a java.security.SecureRandom)
        at SpecTest.main(SpecTest.java:119)
        at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0([email protected]/Native Method)
        at jdk.internal.reflect.NativeMethodAccessorImpl.invoke([email protected]/NativeMethodAccessorImpl.java:62)
        at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke([email protected]/DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke([email protected]/Method.java:566)
        at com.sun.javatest.regtest.agent.MainWrapper$MainThread.run(MainWrapper.java:127)
        at java.lang.Thread.run([email protected]/Thread.java:829)

"Attach Listener" #28 daemon prio=9 os_prio=0 cpu=0.27ms elapsed=0.10s tid=0x0000fffef4001000 nid=0xc5295 waiting on condition  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"VM Thread" os_prio=0 cpu=1.01ms elapsed=12.36s tid=0x0000ffff844e7800 nid=0xc526b runnable  

"GC Thread#0" os_prio=0 cpu=0.70ms elapsed=12.37s tid=0x0000ffff84043000 nid=0xc5266 runnable  

"G1 Main Marker" os_prio=0 cpu=0.18ms elapsed=12.37s tid=0x0000ffff840b4000 nid=0xc5267 runnable  

"G1 Conc#0" os_prio=0 cpu=0.08ms elapsed=12.37s tid=0x0000ffff840b6000 nid=0xc5268 runnable  

"G1 Refine#0" os_prio=0 cpu=0.20ms elapsed=12.37s tid=0x0000ffff8447a800 nid=0xc5269 runnable  

"G1 Young RemSet Sampling" os_prio=0 cpu=0.48ms elapsed=12.37s tid=0x0000ffff8447c800 nid=0xc526a runnable  
"VM Periodic Task Thread" os_prio=0 cpu=1.23ms elapsed=12.25s tid=0x0000ffff8460b000 nid=0xc5275 waiting on condition  

JNI global refs: 33, weak refs: 0

--- Timeout information end.
elapsed time (seconds): 12.403

Additional context

java -version
openjdk version "11.0.17" 2022-10-18
OpenJDK Runtime Environment Temurin-11.0.17+8 (build 11.0.17+8)
OpenJDK 64-Bit Server VM Temurin-11.0.17+8 (build 11.0.17+8, mixed mode)


git remote -v ; git branch ; git log -n 1
origin  [email protected]:adoptium/jdk11u.git (fetch)
origin  [email protected]:adoptium/jdk11u.git (push)
* master
commit b928a88fd564fab88f00086a62c0695ed5cc9353 (HEAD -> master, tag: jdk-11.0.18+1, origin/master, origin/HEAD)
Author: Ichiroh Takiguchi <[email protected]>
Date:   Tue Nov 1 04:50:09 2022 +0000

    8292899: CustomTzIDCheckDST.java testcase failed on AIX platform
    
    Backport-of: 3464019d7e8fe57adc910339c00ba79884c77852

My solution: change /dev/random to /dev/urandom.

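The same fix can be applied without patching the JDK by pointing the JVM at the non-blocking device. The property and flag below are standard JDK settings, not taken from this report, and `app.jar` is a hypothetical placeholder:

```
# per-invocation JVM flag:
java -Djava.security.egd=file:/dev/./urandom -jar app.jar

# or permanently, in $JAVA_HOME/conf/security/java.security:
securerandom.source=file:/dev/./urandom
```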

[Upstream issue] java/nio/channels/DatagramChannel/StressNativeSignal.java times out randomly with the -Xcomp option

https://tone.aliyun-inc.com/ws/xesljfzh/test_result/147439?tab=1

ACTION: main -- Error. Program `/opt/java/openjdk/bin/java' timed out (timeout set to 480000ms, elapsed time including timeout handling was 480223ms).
REASON: Assumed action based on file name: run main StressNativeSignal 
TIME:   480.224 seconds
messages:
command: main StressNativeSignal
reason: Assumed action based on file name: run main StressNativeSignal 
Mode: othervm
Timeout information:
Running jstack on process 3350717
2023-02-20 15:23:40
Full thread dump OpenJDK 64-Bit Server VM (17.0.6+9 compiled mode, sharing):

Threads class SMR info:
_java_thread_list=0x00007fefc8001e30, length=15, elements={
0x00007ff05c026a60, 0x00007ff05c334b80, 0x00007ff05c335f60, 0x00007ff05c33efc0,
0x00007ff05c340370, 0x00007ff05c341780, 0x00007ff05c3431f0, 0x00007ff05c344760,
0x00007ff05c345bd0, 0x00007ff05c348f00, 0x00007ff05c3501f0, 0x00007ff05c3914f0,
0x00007fefac005a80, 0x00007fefac002d30, 0x00007fefc8000e70
}

"main" #1 prio=5 os_prio=0 cpu=90.59ms elapsed=480.20s tid=0x00007ff05c026a60 nid=0x3320c0 in Object.wait()  [0x00007ff065d61000]
   java.lang.Thread.State: WAITING (on object monitor)
	at java.lang.Object.wait([email protected]/Native Method)
	- waiting on <no object reference available>
	at java.lang.Thread.join([email protected]/Thread.java:1304)
	- locked <0x0000000101073db0> (a java.lang.Thread)
	at java.lang.Thread.join([email protected]/Thread.java:1372)
	at com.sun.javatest.regtest.agent.MainWrapper.main(MainWrapper.java:74)

"Reference Handler" #2 daemon prio=10 os_prio=0 cpu=0.13ms elapsed=480.19s tid=0x00007ff05c334b80 nid=0x3320c7 waiting on condition  [0x00007fefe72ae000]
   java.lang.Thread.State: RUNNABLE
	at java.lang.ref.Reference.waitForReferencePendingList([email protected]/Native Method)
	at java.lang.ref.Reference.processPendingReferences([email protected]/Reference.java:253)
	at java.lang.ref.Reference$ReferenceHandler.run([email protected]/Reference.java:215)

"Finalizer" #3 daemon prio=8 os_prio=0 cpu=0.22ms elapsed=480.19s tid=0x00007ff05c335f60 nid=0x3320c8 in Object.wait()  [0x00007fefe71ad000]
   java.lang.Thread.State: WAITING (on object monitor)
	at java.lang.Object.wait([email protected]/Native Method)
	- waiting on <0x0000000101002f40> (a java.lang.ref.ReferenceQueue$Lock)
	at java.lang.ref.ReferenceQueue.remove([email protected]/ReferenceQueue.java:155)
	- locked <0x0000000101002f40> (a java.lang.ref.ReferenceQueue$Lock)
	at java.lang.ref.ReferenceQueue.remove([email protected]/ReferenceQueue.java:176)
	at java.lang.ref.Finalizer$FinalizerThread.run([email protected]/Finalizer.java:172)

"Signal Dispatcher" #4 daemon prio=9 os_prio=0 cpu=0.51ms elapsed=480.18s tid=0x00007ff05c33efc0 nid=0x3320c9 waiting on condition  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Service Thread" #5 daemon prio=9 os_prio=0 cpu=0.16ms elapsed=480.18s tid=0x00007ff05c340370 nid=0x3320ca runnable  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Monitor Deflation Thread" #6 daemon prio=9 os_prio=0 cpu=5.98ms elapsed=480.18s tid=0x00007ff05c341780 nid=0x3320cb runnable  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"C2 CompilerThread0" #7 daemon prio=9 os_prio=0 cpu=6054.25ms elapsed=480.18s tid=0x00007ff05c3431f0 nid=0x3320cc waiting on condition  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE
   No compile task

"C1 CompilerThread0" #21 daemon prio=9 os_prio=0 cpu=1231.45ms elapsed=480.18s tid=0x00007ff05c344760 nid=0x3320cd waiting on condition  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE
   No compile task

"Sweeper thread" #28 daemon prio=9 os_prio=0 cpu=34.10ms elapsed=480.18s tid=0x00007ff05c345bd0 nid=0x3320ce runnable  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Notification Thread" #29 daemon prio=9 os_prio=0 cpu=0.06ms elapsed=479.71s tid=0x00007ff05c348f00 nid=0x3320e5 runnable  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Common-Cleaner" #30 daemon prio=8 os_prio=0 cpu=0.49ms elapsed=479.25s tid=0x00007ff05c3501f0 nid=0x332113 in Object.wait()  [0x00007fefe5a7e000]
   java.lang.Thread.State: TIMED_WAITING (on object monitor)
	at java.lang.Object.wait([email protected]/Native Method)
	- waiting on <no object reference available>
	at java.lang.ref.ReferenceQueue.remove([email protected]/ReferenceQueue.java:155)
	- locked <0x000000010101cce8> (a java.lang.ref.ReferenceQueue$Lock)
	at jdk.internal.ref.CleanerImpl.run([email protected]/CleanerImpl.java:140)
	at java.lang.Thread.run([email protected]/Thread.java:833)
	at jdk.internal.misc.InnocuousThread.run([email protected]/InnocuousThread.java:162)

"MainThread" #32 prio=5 os_prio=0 cpu=11.07ms elapsed=477.64s tid=0x00007ff05c3914f0 nid=0x332277 in Object.wait()  [0x00007fefe597d000]
   java.lang.Thread.State: WAITING (on object monitor)
	at java.lang.Object.wait([email protected]/Native Method)
	- waiting on <no object reference available>
	at java.lang.Thread.join([email protected]/Thread.java:1304)
	- locked <0x00000001017b3138> (a StressNativeSignal$UDPThread)
	at java.lang.Thread.join([email protected]/Thread.java:1372)
	at StressNativeSignal.shutdown(StressNativeSignal.java:64)
	at StressNativeSignal.main(StressNativeSignal.java:58)
	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0([email protected]/Native Method)
	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke([email protected]/NativeMethodAccessorImpl.java:77)
	at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke([email protected]/DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke([email protected]/Method.java:568)
	at com.sun.javatest.regtest.agent.MainWrapper$MainThread.run(MainWrapper.java:127)
	at java.lang.Thread.run([email protected]/Thread.java:833)

"Thread-1" #33 prio=5 os_prio=0 cpu=55.45ms elapsed=477.45s tid=0x00007fefac005a80 nid=0x33228c runnable  [0x00007fefe55fe000]
   java.lang.Thread.State: RUNNABLE
	at sun.nio.ch.Net.accept([email protected]/Native Method)
	at sun.nio.ch.NioSocketImpl.accept([email protected]/NioSocketImpl.java:755)
	at java.net.ServerSocket.implAccept([email protected]/ServerSocket.java:675)
	at java.net.ServerSocket.platformImplAccept([email protected]/ServerSocket.java:641)
	at java.net.ServerSocket.implAccept([email protected]/ServerSocket.java:617)
	at java.net.ServerSocket.implAccept([email protected]/ServerSocket.java:574)
	at java.net.ServerSocket.accept([email protected]/ServerSocket.java:532)
	at StressNativeSignal$ServerSocketThread.run(StressNativeSignal.java:84)

"Thread-2" #34 prio=5 os_prio=0 cpu=78.54ms elapsed=477.44s tid=0x00007fefac002d30 nid=0x33228d runnable  [0x00007fefe51fe000]
   java.lang.Thread.State: RUNNABLE
	at sun.nio.ch.DatagramChannelImpl.receive0([email protected]/Native Method)
	at sun.nio.ch.DatagramChannelImpl.receiveIntoNativeBuffer([email protected]/DatagramChannelImpl.java:750)
	at sun.nio.ch.DatagramChannelImpl.receive([email protected]/DatagramChannelImpl.java:736)
	at sun.nio.ch.DatagramChannelImpl.receive([email protected]/DatagramChannelImpl.java:543)
	at StressNativeSignal$UDPThread.run(StressNativeSignal.java:127)

"Attach Listener" #35 daemon prio=9 os_prio=0 cpu=0.23ms elapsed=0.10s tid=0x00007fefc8000e70 nid=0x33de15 waiting on condition  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"VM Thread" os_prio=0 cpu=11.10ms elapsed=480.19s tid=0x00007ff05c330c00 nid=0x3320c6 runnable  

"GC Thread#0" os_prio=0 cpu=0.53ms elapsed=480.20s tid=0x00007ff05c068e10 nid=0x3320c1 runnable  

"G1 Main Marker" os_prio=0 cpu=0.05ms elapsed=480.20s tid=0x00007ff05c0795a0 nid=0x3320c2 runnable  

"G1 Conc#0" os_prio=0 cpu=0.05ms elapsed=480.20s tid=0x00007ff05c07a590 nid=0x3320c3 runnable  

"G1 Refine#0" os_prio=0 cpu=0.05ms elapsed=480.19s tid=0x00007ff05c2f46c0 nid=0x3320c4 runnable  

"G1 Service" os_prio=0 cpu=39.70ms elapsed=480.19s tid=0x00007ff05c2f55b0 nid=0x3320c5 runnable  

"VM Periodic Task Thread" os_prio=0 cpu=145.44ms elapsed=479.71s tid=0x00007ff05c34efd0 nid=0x3320e6 waiting on condition  

JNI global refs: 32, weak refs: 0

--- Timeout information end.
elapsed time (seconds): 480.224
configuration:
STDOUT:
Timeout refired 480 times
STDERR:
java.lang.NullPointerException: Cannot invoke "java.nio.channels.DatagramChannel.close()" because "this.channel" is null
	at StressNativeSignal$UDPThread.terminate(StressNativeSignal.java:139)
	at StressNativeSignal.shutdown(StressNativeSignal.java:62)
	at StressNativeSignal.main(StressNativeSignal.java:58)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:568)
	at com.sun.javatest.regtest.agent.MainWrapper$MainThread.run(MainWrapper.java:127)
	at java.base/java.lang.Thread.run(Thread.java:833)
rerun:
cd /tmp/tone/run/jtreg/jt-work/test_jdk/java/nio/channels/DatagramChannel/StressNativeSignal && \
DISPLAY=:7 \
HOME=/root \
LANG=en_US.UTF-8 \
PATH=/bin:/usr/bin:/usr/sbin \
TEST_IMAGE_DIR=/tmp/tone/run/jtreg/test-images \
CLASSPATH=/tmp/tone/run/jtreg/jt-work/test_jdk/classes/166/java/nio/channels/DatagramChannel/StressNativeSignal.d:/tmp/tone/run/jtreg/jdk-repo/test/jdk/java/nio/channels/DatagramChannel:/tmp/tone/run/jtreg/jtreg/lib/javatest.jar:/tmp/tone/run/jtreg/jtreg/lib/jtreg.jar \
    /opt/java/openjdk/bin/java \
        -Dtest.vm.opts='-Xcomp -ea -esa' \
        -Dtest.tool.vm.opts='-J-Xcomp -J-ea -J-esa' \
        -Dtest.compiler.opts= \
        -Dtest.java.opts= \
        -Dtest.jdk=/opt/java/openjdk \
        -Dcompile.jdk=/opt/java/openjdk \
        -Dtest.timeout.factor=4.0 \
        -Dtest.nativepath=/tmp/tone/run/jtreg/test-images/hotspot/jtreg/native \
        -Dtest.root=/tmp/tone/run/jtreg/jdk-repo/test/jdk \
        -Dtest.name=java/nio/channels/DatagramChannel/StressNativeSignal.java \
        -Dtest.file=/tmp/tone/run/jtreg/jdk-repo/test/jdk/java/nio/channels/DatagramChannel/StressNativeSignal.java \
        -Dtest.src=/tmp/tone/run/jtreg/jdk-repo/test/jdk/java/nio/channels/DatagramChannel \
        -Dtest.src.path=/tmp/tone/run/jtreg/jdk-repo/test/jdk/java/nio/channels/DatagramChannel \
        -Dtest.classes=/tmp/tone/run/jtreg/jt-work/test_jdk/classes/166/java/nio/channels/DatagramChannel/StressNativeSignal.d \
        -Dtest.class.path=/tmp/tone/run/jtreg/jt-work/test_jdk/classes/166/java/nio/channels/DatagramChannel/StressNativeSignal.d \
        -Xcomp \
        -ea \
        -esa \
        -Djava.library.path=/tmp/tone/run/jtreg/test-images/hotspot/jtreg/native \
        com.sun.javatest.regtest.agent.MainWrapper /tmp/tone/run/jtreg/jt-work/test_jdk/java/nio/channels/DatagramChannel/StressNativeSignal.d/main.0.jta

TEST RESULT: Error. Program `/opt/java/openjdk/bin/java' timed out (timeout set to 480000ms, elapsed time including timeout handling was 480223ms).

replay command:

test=test/jdk/java/nio/channels/DatagramChannel/StressNativeSignal.java
nproc=`nproc` ; dir="tmp-jtreg-"`basename $test .java` ; rm -rf $dir ; mkdir -p $dir ; time seq 50 | xargs -i -n 1 -P $nproc bash -c "jtreg -Xcomp -ea -esa -timeoutFactor:2 -v:fail,error,time,nopass -nr -w $dir/index-{} $test &> $dir/{}.log ; grep 'Test results: passed: 1' -L $dir/{}.log"

12.log
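The NullPointerException in the STDERR section above suggests that shutdown() can run before UDPThread has assigned its channel field. A minimal, hypothetical reduction of that race (class and method names invented for illustration; this is not the test's actual code) with a null-safe terminate looks like:

```java
import java.io.IOException;
import java.nio.channels.DatagramChannel;

// Hypothetical reduction of the race behind the NPE above: the main thread
// calls terminate() while the worker may not yet have assigned its channel.
class UdpWorker extends Thread {
    volatile DatagramChannel channel; // written by the worker, read by the closer

    @Override
    public void run() {
        try {
            channel = DatagramChannel.open();
            // ... the real test blocks in channel.receive(...) here ...
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // Null-safe terminate: a no-op if the channel was never opened.
    void terminate() {
        DatagramChannel ch = channel; // single read avoids a check-then-act race
        if (ch != null) {
            try {
                ch.close();
            } catch (IOException ignored) {
            }
        }
    }
}

public class TerminateRaceDemo {
    public static void main(String[] args) throws InterruptedException {
        UdpWorker w = new UdpWorker();
        w.terminate(); // before start(): a no-op instead of an NPE
        w.start();
        w.join();
        w.terminate(); // after run(): closes the opened channel
        System.out.println("terminate race handled: ok");
    }
}
```

Reading the field once into a local variable before the null check matters: checking `channel != null` and then calling `channel.close()` directly would re-read the volatile field and could still observe an intermediate state.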

[upstream] gtest/LargePageGtests.java#use-large-pages fails on the aarch64 platform

https://tone.aliyun-inc.com/ws/xesljfzh/test_result/148462?tab=2

Command line: [/tmp/tone/run/jtreg/test-images/hotspot/gtest/server/gtestLauncher -jdk /opt/java/openjdk --gtest_output=xml:test_result.xml --gtest_catch_exceptions=0 --gtest_filter=os* -XX:-PrintWarnings -XX:+UseLargePages]
[2023-02-22T22:24:46.032290044Z] Gathering output for process 374135
[2023-02-22T22:24:46.505727012Z] Waiting for completion for process 374135
[2023-02-22T22:24:46.506120661Z] Waiting for completion finished for process 374135
Output and diagnostic info for process 374135 was saved into 'pid-374135-output.log'
Note: Google Test filter = os*
[==========] Running 36 tests from 3 test cases.
[----------] Global test environment set-up.
[----------] 24 tests from os
[ RUN      ] os.page_size_for_region_vm
[0.001s][warning][pagesize] UseLargePages disabled, no large pages configured and available on the system.
[       OK ] os.page_size_for_region_vm (0 ms)
[ RUN      ] os.page_size_for_region_aligned_vm
[       OK ] os.page_size_for_region_aligned_vm (0 ms)
[ RUN      ] os.page_size_for_region_alignment_vm
[       OK ] os.page_size_for_region_alignment_vm (0 ms)
[ RUN      ] os.page_size_for_region_unaligned_vm
[       OK ] os.page_size_for_region_unaligned_vm (0 ms)
[ RUN      ] os.test_random
[       OK ] os.test_random (0 ms)
[ RUN      ] os.test_print_hex_dump_vm
[       OK ] os.test_print_hex_dump_vm (1 ms)
[ RUN      ] os.vsnprintf_vm
[       OK ] os.vsnprintf_vm (0 ms)
[ RUN      ] os.snprintf_vm
[       OK ] os.snprintf_vm (0 ms)
[ RUN      ] os.jio_vsnprintf_vm
[       OK ] os.jio_vsnprintf_vm (0 ms)
[ RUN      ] os.jio_snprintf_vm
[       OK ] os.jio_snprintf_vm (0 ms)
[ RUN      ] os.release_multi_mappings_vm
A
Range [ffff96466000-ffff9646c000) contains: 
ffff96466000-ffff96467000 rwxp 00000000 00:00 0 
ffff96467000-ffff96468000 rw-p 00000000 00:00 0 
ffff96468000-ffff96469000 rwxp 00000000 00:00 0 
ffff96469000-ffff9646a000 rw-p 00000000 00:00 0 
ffff9646a000-ffff9646b000 rwxp 00000000 00:00 0 
ffff9646b000-ffff9646c000 rw-p 00000000 00:00 0 

B
Range [ffff96466000-ffff9646c000) contains: 
ffff96466000-ffff96467000 rwxp 00000000 00:00 0 
ffff9646b000-ffff9646c000 rw-p 00000000 00:00 0 

C
Range [ffff96466000-ffff9646c000) contains: 
ffff96466000-ffff96467000 rwxp 00000000 00:00 0 
ffff96467000-ffff9646b000 ---p 00000000 00:00 0 
ffff9646b000-ffff9646c000 rw-p 00000000 00:00 0 

[       OK ] os.release_multi_mappings_vm (0 ms)
[ RUN      ] os.release_one_mapping_multi_commits_vm
A
Range [ffff20584000-ffff21584000) contains: 
ffff20584000-ffff20984000 rw-p 00000000 00:00 0 
ffff20984000-ffff20d84000 ---p 00000000 00:00 0 
ffff20d84000-ffff21184000 rw-p 00000000 00:00 0 
ffff21184000-ffff21584000 ---p 00000000 00:00 0 

B
Range [ffff20584000-ffff21584000) contains: 
nothing.

test/hotspot/gtest/runtime/test_os.cpp:528: Failure
Expected equality of these values:
  p2
    Which is: NULL
  p
    Which is: 0xffff20584000
[  FAILED  ] os.release_one_mapping_multi_commits_vm (0 ms)
[ RUN      ] os.show_mappings_small_range_vm
[       OK ] os.show_mappings_small_range_vm (1 ms)
[ RUN      ] os.show_mappings_full_range_vm
[       OK ] os.show_mappings_full_range_vm (0 ms)
[ RUN      ] os.os_pagesizes_vm
4k, 2M
[       OK ] os.os_pagesizes_vm (0 ms)
[ RUN      ] os.pagesizes_test_range_vm
[       OK ] os.pagesizes_test_range_vm (0 ms)
[ RUN      ] os.pagesizes_test_print_vm
[       OK ] os.pagesizes_test_print_vm (0 ms)
[ RUN      ] os.dll_address_to_function_and_library_name_vm
[       OK ] os.dll_address_to_function_and_library_name_vm (12 ms)
[ RUN      ] os.iso8601_time_vm
2023-02-22T22:24:46.464+0000
2023-02-22T22:24:46.464+0000
1970-01-01T00:00:00.000+0000
1970-01-01T00:00:00.017+0000
[       OK ] os.iso8601_time_vm (0 ms)
[ RUN      ] os.is_first_C_frame_vm
[       OK ] os.is_first_C_frame_vm (0 ms)
[ RUN      ] os.safefetch_can_use_vm
[       OK ] os.safefetch_can_use_vm (0 ms)
[ RUN      ] os.safefetch_positive_vm
[       OK ] os.safefetch_positive_vm (0 ms)
[ RUN      ] os.safefetch_negative_vm
[       OK ] os.safefetch_negative_vm (0 ms)
[ RUN      ] os.safefetch_negative_at_safepoint_vm
[       OK ] os.safefetch_negative_at_safepoint_vm (9 ms)
[----------] 24 tests from os (479 ms total)

[----------] 7 tests from os_linux
[ RUN      ] os_linux.reserve_memory_special_huge_tlbfs_size_aligned_vm
[       OK ] os_linux.reserve_memory_special_huge_tlbfs_size_aligned_vm (0 ms)
[ RUN      ] os_linux.reserve_memory_special_huge_tlbfs_size_not_aligned_without_addr_vm
[       OK ] os_linux.reserve_memory_special_huge_tlbfs_size_not_aligned_without_addr_vm (0 ms)
[ RUN      ] os_linux.reserve_memory_special_huge_tlbfs_size_not_aligned_with_good_req_addr_vm
[       OK ] os_linux.reserve_memory_special_huge_tlbfs_size_not_aligned_with_good_req_addr_vm (0 ms)
[ RUN      ] os_linux.reserve_memory_special_huge_tlbfs_size_not_aligned_with_bad_req_addr_vm
[       OK ] os_linux.reserve_memory_special_huge_tlbfs_size_not_aligned_with_bad_req_addr_vm (0 ms)
[ RUN      ] os_linux.reserve_memory_special_shm_vm
[       OK ] os_linux.reserve_memory_special_shm_vm (0 ms)
[ RUN      ] os_linux.reserve_memory_special_vm
[       OK ] os_linux.reserve_memory_special_vm (0 ms)
[ RUN      ] os_linux.reserve_memory_special_concurrent_vm
[       OK ] os_linux.reserve_memory_special_concurrent_vm (0 ms)
[----------] 7 tests from os_linux (0 ms total)

[----------] 5 tests from ostream
[ RUN      ] ostream.stringStream_dynamic_start_with_internal_buffer_vm
[       OK ] ostream.stringStream_dynamic_start_with_internal_buffer_vm (0 ms)
[ RUN      ] ostream.stringStream_dynamic_start_with_malloced_buffer_vm
[       OK ] ostream.stringStream_dynamic_start_with_malloced_buffer_vm (0 ms)
[ RUN      ] ostream.stringStream_static_vm
[       OK ] ostream.stringStream_static_vm (0 ms)
[ RUN      ] ostream.bufferedStream_static_vm
[       OK ] ostream.bufferedStream_static_vm (0 ms)
[ RUN      ] ostream.bufferedStream_dynamic_small_vm
[       OK ] ostream.bufferedStream_dynamic_small_vm (0 ms)
[----------] 5 tests from ostream (0 ms total)

[----------] Global test environment tear-down
[==========] 36 tests from 3 test cases ran. (479 ms total)
[  PASSED  ] 35 tests.
[  FAILED  ] 1 test, listed below:
[  FAILED  ] os.release_one_mapping_multi_commits_vm

 1 FAILED TEST
ERROR: RUN_ALL_TESTS() failed. Error 1

[2023-02-22T22:24:46.515201753Z] Waiting for completion for process 374135
[2023-02-22T22:24:46.515344949Z] Waiting for completion finished for process 374135
STDERR:
java.lang.AssertionError: gtest execution failed; exit code = 2. the failed tests: [os::release_one_mapping_multi_commits_vm]
	at GTestWrapper.main(GTestWrapper.java:98)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:568)
	at com.sun.javatest.regtest.agent.MainWrapper$MainThread.run(MainWrapper.java:127)
	at java.base/java.lang.Thread.run(Thread.java:833)

JavaTest Message: Test threw exception: java.lang.AssertionError: gtest execution failed; exit code = 2. the failed tests: [os::release_one_mapping_multi_commits_vm]
JavaTest Message: shutting down test

[Wisp] Port Wisp1/2 to JDK17.

Summary:

  • Ports Wisp1/2 (including the JDK11 port's changes from 8.6.11_fp1)
  • Improves the initialization process
  • WispCounterMXBean and WispPerfCounterMonitor are not supported yet

Test Plan:
test/hotspot/jtreg/runtime/coroutine/
test/jdk/com/alibaba/wisp
test/jdk/com/alibaba/wisp2

Does Dragonwell17 not support Wisp2?

JDK version:

openjdk version "17.0.2" 2022-01-18
OpenJDK Runtime Environment (Alibaba Dragonwell)-17.0.2.0.2+8-GA (build 17.0.2+8)
OpenJDK 64-Bit Server VM (Alibaba Dragonwell)-17.0.2.0.2+8-GA (build 17.0.2+8, mixed mode, sharing)

OS:

Ubuntu 21.10

Question:

When running a jar file with "-XX:+UnlockExperimentalVMOptions -XX:+UseWisp2", the console prints "Unrecognized VM option 'UseWisp2'".
Shell:

java -XX:+UnlockExperimentalVMOptions -XX:+UseWisp2 -jar xxxxxxxxxx-1.4.1-RELEASE.jar --spring.profiles.active=dev

print

Unrecognized VM option 'UseWisp2'
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
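One way to check whether the running build actually recognizes a flag such as UseWisp2, before passing it on the command line, is to query the standard HotSpotDiagnosticMXBean, which throws IllegalArgumentException for flags the VM was built without. A diagnostic sketch (whether UseWisp2 is available depends on the specific Dragonwell build and edition):

```java
import java.lang.management.ManagementFactory;
import com.sun.management.HotSpotDiagnosticMXBean;

public class FlagCheck {
    public static void main(String[] args) {
        HotSpotDiagnosticMXBean bean =
                ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
        for (String flag : new String[] {"UseG1GC", "UseWisp2"}) {
            try {
                // getVMOption throws IllegalArgumentException for unknown flags.
                System.out.println(flag + " = " + bean.getVMOption(flag).getValue());
            } catch (IllegalArgumentException e) {
                System.out.println(flag + " is not recognized by this JVM");
            }
        }
    }
}
```

On a build without Wisp this reports that UseWisp2 is not recognized; a Wisp-enabled Dragonwell build should report its value instead.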

Question: Releases marked as prerelease

Just saw that the releases are currently marked as prerelease=true, but the packages are already available for download, even on the Dragonwell website. The question is whether there will be "real" releases, or will the JDK17 repo always use prereleases?
Thanks in advance...

[upstream] gtest/LargePageGtests.java#use-large-pages-sysV fails intermittently

https://tone.aliyun-inc.com/ws/xesljfzh/test_result/148919?tab=2

A
Range [7fdb66fb9000-7fdb66fbf000) contains: 
7fdb66fb9000-7fdb66fba000 rwxp 00000000 00:00 0 
7fdb66fba000-7fdb66fbb000 rw-p 00000000 00:00 0 
7fdb66fbb000-7fdb66fbc000 rwxp 00000000 00:00 0 
7fdb66fbc000-7fdb66fbd000 rw-p 00000000 00:00 0 
7fdb66fbd000-7fdb66fbe000 rwxp 00000000 00:00 0 
7fdb66fbe000-7fdb67000000 rw-p 00000000 00:00 0 

B
Range [7fdb66fb9000-7fdb66fbf000) contains: 
7fdb66fb9000-7fdb66fba000 rwxp 00000000 00:00 0 
7fdb66fbe000-7fdb67000000 rw-p 00000000 00:00 0 

C
Range [7fdb66fb9000-7fdb66fbf000) contains: 
7fdb66fb9000-7fdb66fba000 rwxp 00000000 00:00 0 
7fdb66fba000-7fdb66fbe000 ---p 00000000 00:00 0 
7fdb66fbe000-7fdb67000000 rw-p 00000000 00:00 0 

[       OK ] os.release_multi_mappings_vm (0 ms)
[ RUN      ] os.release_one_mapping_multi_commits_vm
A
Range [7fdab7000000-7fdab8000000) contains: 
7fdab7000000-7fdab7400000 rw-p 00000000 00:00 0 
7fdab7400000-7fdab7800000 ---p 00000000 00:00 0 
7fdab7800000-7fdab7c00000 rw-p 00000000 00:00 0 
7fdab7c00000-7fdab8000000 ---p 00000000 00:00 0 

B
Range [7fdab7000000-7fdab8000000) contains: 
nothing.

test/hotspot/gtest/runtime/test_os.cpp:528: Failure
Expected equality of these values:
  p2
    Which is: NULL
  p
    Which is: 0x7fdab7000000
[  FAILED  ] os.release_one_mapping_multi_commits_vm (1 ms)

[upstream issue] java/util/DoubleStreamSums/CompensatedSums.java fails intermittently

https://tone.aliyun-inc.com/ws/xesljfzh/test_result/152124?tab=2

STDOUT:
Seed from RandomFactory = 7356473784989964689L
test CompensatedSums.testCompensatedSums(): failure
java.lang.AssertionError: expected [true] but found [false]
	at org.testng.Assert.fail(Assert.java:99)
	at org.testng.Assert.failNotEquals(Assert.java:1037)
	at org.testng.Assert.assertTrue(Assert.java:45)
	at org.testng.Assert.assertTrue(Assert.java:55)
	at CompensatedSums.testCompensatedSums(CompensatedSums.java:94)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:568)
	at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:132)
	at org.testng.internal.TestInvoker.invokeMethod(TestInvoker.java:599)
	at org.testng.internal.TestInvoker.invokeTestMethod(TestInvoker.java:174)
	at org.testng.internal.MethodRunner.runInSequence(MethodRunner.java:46)
	at org.testng.internal.TestInvoker$MethodInvocationAgent.invoke(TestInvoker.java:822)
	at org.testng.internal.TestInvoker.invokeTestMethods(TestInvoker.java:147)
	at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:146)
	at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:128)
	at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
	at org.testng.TestRunner.privateRun(TestRunner.java:764)
	at org.testng.TestRunner.run(TestRunner.java:585)
	at org.testng.SuiteRunner.runTest(SuiteRunner.java:384)
	at org.testng.SuiteRunner.runSequentially(SuiteRunner.java:378)
	at org.testng.SuiteRunner.privateRun(SuiteRunner.java:337)
	at org.testng.SuiteRunner.run(SuiteRunner.java:286)
	at org.testng.SuiteRunnerWorker.runSuite(SuiteRunnerWorker.java:53)
	at org.testng.SuiteRunnerWorker.run(SuiteRunnerWorker.java:96)
	at org.testng.TestNG.runSuitesSequentially(TestNG.java:1218)
	at org.testng.TestNG.runSuitesLocally(TestNG.java:1140)
	at org.testng.TestNG.runSuites(TestNG.java:1069)
	at org.testng.TestNG.run(TestNG.java:1037)
	at com.sun.javatest.regtest.agent.TestNGRunner.main(TestNGRunner.java:94)
	at com.sun.javatest.regtest.agent.TestNGRunner.main(TestNGRunner.java:54)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:568)
	at com.sun.javatest.regtest.agent.MainWrapper$MainThread.run(MainWrapper.java:127)
	at java.base/java.lang.Thread.run(Thread.java:833)

===============================================
java/util/DoubleStreamSums/CompensatedSums.java
Total tests run: 1, Passes: 0, Failures: 1, Skips: 0
===============================================

STDERR:
java.lang.Exception: failures: 1
	at com.sun.javatest.regtest.agent.TestNGRunner.main(TestNGRunner.java:96)
	at com.sun.javatest.regtest.agent.TestNGRunner.main(TestNGRunner.java:54)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:568)
	at com.sun.javatest.regtest.agent.MainWrapper$MainThread.run(MainWrapper.java:127)
	at java.base/java.lang.Thread.run(Thread.java:833)

JavaTest Message: Test threw exception: java.lang.Exception: failures: 1
JavaTest Message: shutting down test

STATUS:Failed.`main' threw exception: java.lang.Exception: failures: 1
rerun:
cd /tmp/tone/run/jtreg/jt-work/test_jdk/java/util/DoubleStreamSums/CompensatedSums && \
DISPLAY=:7 \
HOME=/root \
LANG=en_US.UTF-8 \
PATH=/bin:/usr/bin:/usr/sbin \
TEST_IMAGE_DIR=/tmp/tone/run/jtreg/test-images \
CLASSPATH=/tmp/tone/run/jtreg/jt-work/test_jdk/classes/83/java/util/DoubleStreamSums/CompensatedSums.d:/tmp/tone/run/jtreg/jdk-repo/test/jdk/java/util/DoubleStreamSums:/tmp/tone/run/jtreg/jt-work/test_jdk/classes/83/test/lib:/tmp/tone/run/jtreg/jdk-repo/test/lib:/tmp/tone/run/jtreg/jtreg/lib/testng.jar:/tmp/tone/run/jtreg/jtreg/lib/jcommander.jar:/tmp/tone/run/jtreg/jtreg/lib/guice.jar:/tmp/tone/run/jtreg/jtreg/lib/javatest.jar:/tmp/tone/run/jtreg/jtreg/lib/jtreg.jar \
    /opt/java/openjdk/bin/java \
        -Dtest.vm.opts='-Xmixed -ea -esa' \
        -Dtest.tool.vm.opts='-J-Xmixed -J-ea -J-esa' \
        -Dtest.compiler.opts= \
        -Dtest.java.opts= \
        -Dtest.jdk=/opt/java/openjdk \
        -Dcompile.jdk=/opt/java/openjdk \
        -Dtest.timeout.factor=4.0 \
        -Dtest.nativepath=/tmp/tone/run/jtreg/test-images/hotspot/jtreg/native \
        -Dtest.root=/tmp/tone/run/jtreg/jdk-repo/test/jdk \
        -Dtest.name=java/util/DoubleStreamSums/CompensatedSums.java \
        -Dtest.file=/tmp/tone/run/jtreg/jdk-repo/test/jdk/java/util/DoubleStreamSums/CompensatedSums.java \
        -Dtest.src=/tmp/tone/run/jtreg/jdk-repo/test/jdk/java/util/DoubleStreamSums \
        -Dtest.src.path=/tmp/tone/run/jtreg/jdk-repo/test/jdk/java/util/DoubleStreamSums:/tmp/tone/run/jtreg/jdk-repo/test/lib \
        -Dtest.classes=/tmp/tone/run/jtreg/jt-work/test_jdk/classes/83/java/util/DoubleStreamSums/CompensatedSums.d \
        -Dtest.class.path=/tmp/tone/run/jtreg/jt-work/test_jdk/classes/83/java/util/DoubleStreamSums/CompensatedSums.d:/tmp/tone/run/jtreg/jt-work/test_jdk/classes/83/test/lib \
        -Xmixed \
        -ea \
        -esa \
        -Djava.library.path=/tmp/tone/run/jtreg/test-images/hotspot/jtreg/native \
        com.sun.javatest.regtest.agent.MainWrapper /tmp/tone/run/jtreg/jt-work/test_jdk/java/util/DoubleStreamSums/CompensatedSums.d/testng.0.jta java/util/DoubleStreamSums/CompensatedSums.java false CompensatedSums

TEST RESULT: Failed. Execution failed: `main' threw exception: java.lang.Exception: failures: 1
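For context, CompensatedSums compares a naive running sum of doubles against compensated (Kahan) summation, which is what the assertion at CompensatedSums.java:94 exercises. A minimal, self-contained sketch of Kahan summation (illustrative only, not the JDK's implementation):

```java
// Kahan (compensated) summation: carry a correction term so that low-order
// bits lost in each addition are fed back into the next one.
public class KahanSum {
    static double kahanSum(double[] values) {
        double sum = 0.0;
        double c = 0.0;                  // running compensation for lost bits
        for (double v : values) {
            double y = v - c;            // apply the correction
            double t = sum + y;          // low-order bits of y may be lost here
            c = (t - sum) - y;           // algebraically zero; captures the lost bits
            sum = t;
        }
        return sum;
    }

    public static void main(String[] args) {
        // 1e16 has a ulp of 2.0, so naively adding 1.0 to it is a no-op.
        double[] data = {1e16, 1.0, 1.0};
        double naive = 0.0;
        for (double v : data) naive += v;
        System.out.println("naive = " + naive);          // stuck at 1.0E16
        System.out.println("kahan = " + kahanSum(data)); // 1.0E16 + 2
    }
}
```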

[upstream issue] Running java/rmi/server/RMISocketFactory/useSocketFactory/unicast/UseCustomSocketFactory.java with -Xcomp fails intermittently

https://tone.aliyun-inc.com/ws/xesljfzh/test_result/147439?tab=1

STDOUT:

Regression test for bug 4127826

EchoServer: creating remote object
EchoServer: binding in registry
STDERR:
WARNING: A command line option has enabled the Security Manager
WARNING: The Security Manager is deprecated and will be removed in a future release
test policy: =/tmp/tone/run/jtreg/jt-work/test_jdk/java/rmi/server/RMISocketFactory/useSocketFactory/unicast/UseCustomSocketFactory/security.policy_new

using protocol: none
JAVAVM: command = [/opt/java/openjdk/bin/java, -Xcomp, -ea, -esa, -Djava.security.manager=allow, -Djava.security.policy==/tmp/tone/run/jtreg/jt-work/test_jdk/java/rmi/server/RMISocketFactory/useSocketFactory/unicast/UseCustomSocketFactory/security.policy_new, -Drmi.registry.port=33109, EchoImpl]
WARNING: A terminally deprecated method in java.lang.System has been called
WARNING: System::setSecurityManager has been called by TestLibrary (file:/tmp/tone/run/jtreg/jt-work/test_jdk/classes/190/java/rmi/testlibrary/)
WARNING: Please consider reporting this to the maintainers of TestLibrary
WARNING: System::setSecurityManager will be removed in a future release
TEST FAILED: server not bound in 8 tries
TEST FAILED: 
Test failed with: EchoServer
java.rmi.NotBoundException: EchoServer
	at java.rmi/sun.rmi.registry.RegistryImpl.unbind(RegistryImpl.java:273)
	at java.rmi/sun.rmi.registry.RegistryImpl_Skel.dispatch(RegistryImpl_Skel.java:186)
	at java.rmi/sun.rmi.server.UnicastServerRef.oldDispatch(UnicastServerRef.java:470)
	at java.rmi/sun.rmi.server.UnicastServerRef.dispatch(UnicastServerRef.java:299)
	at java.rmi/sun.rmi.transport.Transport$1.run(Transport.java:200)
	at java.rmi/sun.rmi.transport.Transport$1.run(Transport.java:197)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
	at java.rmi/sun.rmi.transport.Transport.serviceCall(Transport.java:196)
	at java.rmi/sun.rmi.transport.tcp.TCPTransport.handleMessages(TCPTransport.java:587)
	at java.rmi/sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:828)
	at java.rmi/sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.lambda$run$0(TCPTransport.java:705)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:399)
	at java.rmi/sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:704)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:833)
	at java.rmi/sun.rmi.transport.StreamRemoteCall.exceptionReceivedFromServer(StreamRemoteCall.java:304)
	at java.rmi/sun.rmi.transport.StreamRemoteCall.executeCall(StreamRemoteCall.java:280)
	at java.rmi/sun.rmi.server.UnicastRef.invoke(UnicastRef.java:381)
	at java.rmi/sun.rmi.registry.RegistryImpl_Stub.unbind(RegistryImpl_Stub.java:180)
	at java.rmi/java.rmi.Naming.unbind(Naming.java:152)
	at UseCustomSocketFactory.main(UseCustomSocketFactory.java:112)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:568)
	at com.sun.javatest.regtest.agent.MainWrapper$MainThread.run(MainWrapper.java:127)
	at java.base/java.lang.Thread.run(Thread.java:833)
TestFailedException: TEST FAILED: ; nested exception is: 
	java.rmi.NotBoundException: EchoServer
java.rmi.NotBoundException: EchoServer
	at java.rmi/sun.rmi.registry.RegistryImpl.unbind(RegistryImpl.java:273)
	at java.rmi/sun.rmi.registry.RegistryImpl_Skel.dispatch(RegistryImpl_Skel.java:186)
	at java.rmi/sun.rmi.server.UnicastServerRef.oldDispatch(UnicastServerRef.java:470)
	at java.rmi/sun.rmi.server.UnicastServerRef.dispatch(UnicastServerRef.java:299)
	at java.rmi/sun.rmi.transport.Transport$1.run(Transport.java:200)
	at java.rmi/sun.rmi.transport.Transport$1.run(Transport.java:197)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
	at java.rmi/sun.rmi.transport.Transport.serviceCall(Transport.java:196)
	at java.rmi/sun.rmi.transport.tcp.TCPTransport.handleMessages(TCPTransport.java:587)
	at java.rmi/sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:828)
	at java.rmi/sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.lambda$run$0(TCPTransport.java:705)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:399)
	at java.rmi/sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:704)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:833)
	at java.rmi/sun.rmi.transport.StreamRemoteCall.exceptionReceivedFromServer(StreamRemoteCall.java:304)
	at java.rmi/sun.rmi.transport.StreamRemoteCall.executeCall(StreamRemoteCall.java:280)
	at java.rmi/sun.rmi.server.UnicastRef.invoke(UnicastRef.java:381)
	at java.rmi/sun.rmi.registry.RegistryImpl_Stub.unbind(RegistryImpl_Stub.java:180)
	at java.rmi/java.rmi.Naming.unbind(Naming.java:152)
	at UseCustomSocketFactory.main(UseCustomSocketFactory.java:112)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:568)
	at com.sun.javatest.regtest.agent.MainWrapper$MainThread.run(MainWrapper.java:127)
	at java.base/java.lang.Thread.run(Thread.java:833)

JavaTest Message: Test threw exception: TestFailedException: TEST FAILED: ; nested exception is: 
	java.rmi.NotBoundException: EchoServer
JavaTest Message: shutting down test

STATUS:Failed.`main' threw exception: TestFailedException: TEST FAILED: ; nested exception is: java.rmi.NotBoundException: EchoServer
rerun:
cd /tmp/tone/run/jtreg/jt-work/test_jdk/java/rmi/server/RMISocketFactory/useSocketFactory/unicast/UseCustomSocketFactory && \
DISPLAY=:7 \
HOME=/root \
LANG=en_US.UTF-8 \
PATH=/bin:/usr/bin:/usr/sbin \
TEST_IMAGE_DIR=/tmp/tone/run/jtreg/test-images \
CLASSPATH=/tmp/tone/run/jtreg/jt-work/test_jdk/classes/190/java/rmi/server/RMISocketFactory/useSocketFactory/unicast/UseCustomSocketFactory.d:/tmp/tone/run/jtreg/jdk-repo/test/jdk/java/rmi/server/RMISocketFactory/useSocketFactory/unicast:/tmp/tone/run/jtreg/jt-work/test_jdk/classes/190/java/rmi/testlibrary:/tmp/tone/run/jtreg/jdk-repo/test/jdk/java/rmi/testlibrary:/tmp/tone/run/jtreg/jtreg/lib/javatest.jar:/tmp/tone/run/jtreg/jtreg/lib/jtreg.jar \
    /opt/java/openjdk/bin/java \
        -Dtest.vm.opts='-Xcomp -ea -esa' \
        -Dtest.tool.vm.opts='-J-Xcomp -J-ea -J-esa' \
        -Dtest.compiler.opts= \
        -Dtest.java.opts= \
        -Dtest.jdk=/opt/java/openjdk \
        -Dcompile.jdk=/opt/java/openjdk \
        -Dtest.timeout.factor=4.0 \
        -Dtest.nativepath=/tmp/tone/run/jtreg/test-images/hotspot/jtreg/native \
        -Dtest.root=/tmp/tone/run/jtreg/jdk-repo/test/jdk \
        -Dtest.name=java/rmi/server/RMISocketFactory/useSocketFactory/unicast/UseCustomSocketFactory.java \
        -Dtest.file=/tmp/tone/run/jtreg/jdk-repo/test/jdk/java/rmi/server/RMISocketFactory/useSocketFactory/unicast/UseCustomSocketFactory.java \
        -Dtest.src=/tmp/tone/run/jtreg/jdk-repo/test/jdk/java/rmi/server/RMISocketFactory/useSocketFactory/unicast \
        -Dtest.src.path=/tmp/tone/run/jtreg/jdk-repo/test/jdk/java/rmi/server/RMISocketFactory/useSocketFactory/unicast:/tmp/tone/run/jtreg/jdk-repo/test/jdk/java/rmi/testlibrary \
        -Dtest.classes=/tmp/tone/run/jtreg/jt-work/test_jdk/classes/190/java/rmi/server/RMISocketFactory/useSocketFactory/unicast/UseCustomSocketFactory.d \
        -Dtest.class.path=/tmp/tone/run/jtreg/jt-work/test_jdk/classes/190/java/rmi/server/RMISocketFactory/useSocketFactory/unicast/UseCustomSocketFactory.d:/tmp/tone/run/jtreg/jt-work/test_jdk/classes/190/java/rmi/testlibrary \
        -Dtest.modules='java.rmi/sun.rmi.registry java.rmi/sun.rmi.server java.rmi/sun.rmi.transport java.rmi/sun.rmi.transport.tcp' \
        -Djava.security.policy==/tmp/tone/run/jtreg/jt-work/test_jdk/java/rmi/server/RMISocketFactory/useSocketFactory/unicast/UseCustomSocketFactory/security.policy_new \
        -Djava.security.manager=default \
        --add-modules java.rmi \
        --add-exports java.rmi/sun.rmi.registry=ALL-UNNAMED \
        --add-exports java.rmi/sun.rmi.server=ALL-UNNAMED \
        --add-exports java.rmi/sun.rmi.transport=ALL-UNNAMED \
        --add-exports java.rmi/sun.rmi.transport.tcp=ALL-UNNAMED \
        -Xcomp \
        -ea \
        -esa \
        -Djava.library.path=/tmp/tone/run/jtreg/test-images/hotspot/jtreg/native \
        com.sun.javatest.regtest.agent.MainWrapper /tmp/tone/run/jtreg/jt-work/test_jdk/java/rmi/server/RMISocketFactory/useSocketFactory/unicast/UseCustomSocketFactory.d/main.3.jta

TEST RESULT: Failed. Execution failed: `main' threw exception: TestFailedException: TEST FAILED: ; nested exception is: java.rmi.NotBoundException: EchoServer

replay command:

test=test/jdk/java/rmi/server/RMISocketFactory/useSocketFactory/unicast/UseCustomSocketFactory.java
nproc=`nproc` ; dir="tmp-jtreg-"`basename $test .java` ; rm -rf $dir ; mkdir -p $dir ; time seq 50 | xargs -i -n 1 -P $nproc bash -c "jtreg -Xcomp -ea -esa -timeoutFactor:2 -v:fail,error,time,nopass -nr -w $dir/index-{} $test &> $dir/{}.log ; grep 'Test results: passed: 1' -L $dir/{}.log"

Reproduction rate: 48/50

9.log
8.log
7.log
6.log
5.log
4.log
48.log

[upstream] runtime/handshake/HandshakeDirectTest.java crashes intermittently with -XX:-UseCompressedOops on a 64-core Yitian 710 machine

https://tone.aliyun-inc.com/ws/xesljfzh/test_result/147441?tab=1

command: main -XX:+UnlockDiagnosticVMOptions -XX:+UseBiasedLocking -XX:+SafepointALot -XX:BiasedLockingDecayTime=100000000 -XX:BiasedLockingBulkRebiasThreshold=1000000 -XX:BiasedLockingBulkRevokeThreshold=1000000 HandshakeDirectTest
reason: User specified action: run main/othervm -XX:+UnlockDiagnosticVMOptions -XX:+UseBiasedLocking -XX:+SafepointALot -XX:BiasedLockingDecayTime=100000000 -XX:BiasedLockingBulkRebiasThreshold=1000000 -XX:BiasedLockingBulkRevokeThreshold=1000000 HandshakeDirectTest 
Mode: othervm [/othervm specified]
elapsed time (seconds): 6.25
configuration:
STDOUT:
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x0000ffffad9620e0, pid=464524, tid=465210
#
# JRE version: OpenJDK Runtime Environment (Alibaba Dragonwell Standard Edition)-17.0.6.0.6+9-GA (17.0.6+9) (build 17.0.6+9)
# Java VM: OpenJDK 64-Bit Server VM (Alibaba Dragonwell Standard Edition)-17.0.6.0.6+9-GA (17.0.6+9, mixed mode, sharing, tiered, compressed class ptrs, g1 gc, linux-aarch64)
# Problematic frame:
# V  [libjvm.so+0x6820e0]  frame::sender(RegisterMap*) const+0x140
[thread 465224 also had an error]
#
# Core dump will be written. Default location: Core dumps may be processed with "/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h %e" (or dumping to /tmp/tone/run/jtreg/jt-work/hotspot_jtreg/runtime/handshake/HandshakeDirectTest/core.464524)
#
# An error report file with more information is saved as:
# /tmp/tone/run/jtreg/jt-work/hotspot_jtreg/runtime/handshake/HandshakeDirectTest/hs_err_pid464524.log
#
# If you would like to submit a bug report, please visit:
#   mailto:[email protected]
#
test=test/hotspot/jtreg/runtime/handshake/HandshakeDirectTest.java
nproc=`nproc` ; dir="tmp-jtreg-"`basename $test .java` ; rm -rf $dir ; mkdir -p $dir ; time seq 50 | xargs -i -n 1 -P `expr $nproc / 2` bash -c "jtreg -XX:-UseCompressedOops -ea -esa -timeoutFactor:4 -v:fail,error,time,nopass -nr -w $dir/index-{} $test &> $dir/{}.log ; grep 'Test results: passed: 1' -L $dir/{}.log"

Reproduction rate: 2/50
45.log
42.log

So far this only reproduces on 64-core Yitian 710 instances.


Startup error:

Unrecognized VM option 'G1ElasticHeap'

Error: Could not create the Java Virtual Machine.

Error: A fatal exception has occurred. Program will exit.

[upstream] gtest/GTestWrapper.java crashes intermittently on the aarch64 platform

https://tone.aliyun-inc.com/ws/xesljfzh/test_result/149319?tab=2

#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x0000000000000000, pid=373764, tid=377187
#
# JRE version: OpenJDK Runtime Environment (Alibaba Dragonwell Standard Edition)-17.0.6.0.6+9-GA (17.0.6) (build 17.0.6+9)
# Java VM: OpenJDK 64-Bit Server VM (Alibaba Dragonwell Standard Edition)-17.0.6.0.6+9-GA (17.0.6-internal+0-adhoc..jdk-repo, mixed mode, tiered, compressed oops, compressed class ptrs, g1 gc, linux-aarch64)
# Problematic frame:
# C  0x0000000000000000
#
# Core dump will be written. Default location: Core dumps may be processed with "/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h %e" (or dumping to /tmp/tone/run/jtreg/jt-work/hotspot_jtreg/gtest/GTestWrapper/core.373764)
#
# An error report file with more information is saved as:
# /tmp/tone/run/jtreg/jt-work/hotspot_jtreg/gtest/GTestWrapper/hs_err_pid373764.log
#
# If you would like to submit a bug report, please visit:
#   mailto:[email protected]
#
Warn

[upstream issue] jdk/jfr/event/compiler/TestDeoptimization.java fails consistently when run with -Xcomp

https://tone.aliyun-inc.com/ws/xesljfzh/test_result/147439?tab=1

STDOUT:
Deoptimization Test
Deoptimized
Time to load, compile and deoptimize dummyMethod: 98
STDERR:
java.lang.RuntimeException: couldn't find any jdk.Deoptimization for ids : [12150, 12155, 12149, 12154]: expected false, was true
	at jdk.test.lib.Asserts.fail(Asserts.java:594)
	at jdk.test.lib.Asserts.assertFalse(Asserts.java:461)
	at jdk.jfr.event.compiler.TestDeoptimization.doTest(TestDeoptimization.java:124)
	at jdk.jfr.event.compiler.TestDeoptimization.main(TestDeoptimization.java:69)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:568)
	at com.sun.javatest.regtest.agent.MainWrapper$MainThread.run(MainWrapper.java:127)
	at java.base/java.lang.Thread.run(Thread.java:833)

[upstream issue] javax/accessibility/JFileChooserAccessibleDescriptionTest.java fails intermittently with java.lang.RuntimeException: Accessibility Description forJFileChooser is not Set

STDERR:
java.lang.RuntimeException: Accessibility Description forJFileChooser is not Set
	at JFileChooserAccessibleDescriptionTest.doTest(JFileChooserAccessibleDescriptionTest.java:90)
	at JFileChooserAccessibleDescriptionTest.main(JFileChooserAccessibleDescriptionTest.java:99)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:568)
	at com.sun.javatest.regtest.agent.MainWrapper$MainThread.run(MainWrapper.java:127)
	at java.base/java.lang.Thread.run(Thread.java:833)

https://tone.aliyun-inc.com/ws/xesljfzh/test_result/146375?tab=1

[Wisp] Port bug fixes and Multi-tenant support to JDK17.

Summary:

  1. Port Multi-tenant support
  2. AOT support for compiledMethods_do
  3. Safepoint-based Wisp preemption (JDK part)
  4. Support using an fd across coroutines
  5. Fix: closing a WispSocket did not wake up a blocked fd
  6. Remove the parent task introduced by FutureTask
  7. Solve a blocking issue with nio accept

Test Plan: all wisp cases

[upstream] art-test 530-checker-loops2: C1 JIT SIGSEGV

test command:

java -Xcomp -XX:TieredStopAtLevel=1 Main 

result:

#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x00007f123d43ad97, pid=1631317, tid=1631318
#
# JRE version: OpenJDK Runtime Environment (Alibaba Dragonwell Standard Edition)-17.0.6.0.6+9-GA (17.0.6+9) (build 17.0.6+9)
# Java VM: OpenJDK 64-Bit Server VM (Alibaba Dragonwell Standard Edition)-17.0.6.0.6+9-GA (17.0.6+9, compiled mode, emulated-client, sharing, tiered, compressed oops, compressed class ptrs, g1 gc, linux-amd64)
# Problematic frame:
# J 443 c1 Main.hiddenInfiniteOOB()V (53 bytes) @ 0x00007f123d43ad97 [0x00007f123d43aca0+0x00000000000000f7]
#
# Core dump will be written. Default location: Core dumps may be processed with "/usr/share/apport/apport -p%p -s%s -c%c -d%d -P%P -u%u -g%g -- %E" (or dumping to /home/yansendao/tmp/core.1631317)
#
# An error report file with more information is saved as:
# /home/yansendao/tmp/hs_err_pid1631317.log
Compiled method (c1)     155  443       1       Main::hiddenInfiniteOOB (53 bytes)
 total in heap  [0x00007f123d43ab10,0x00007f123d43af28] = 1048
 relocation     [0x00007f123d43ac70,0x00007f123d43aca0] = 48
 main code      [0x00007f123d43aca0,0x00007f123d43ae40] = 416
 stub code      [0x00007f123d43ae40,0x00007f123d43ae70] = 48
 oops           [0x00007f123d43ae70,0x00007f123d43ae78] = 8
 metadata       [0x00007f123d43ae78,0x00007f123d43ae80] = 8
 scopes data    [0x00007f123d43ae80,0x00007f123d43aec0] = 64
 scopes pcs     [0x00007f123d43aec0,0x00007f123d43af20] = 96
 dependencies   [0x00007f123d43af20,0x00007f123d43af28] = 8
#
# If you would like to submit a bug report, please visit:
#   mailto:[email protected]
#
Aborted (core dumped)

hs_err_pid1631317.log

A JDK 22 run also SIGSEGVs:

[image attachment: JDK 22 crash output]

hs_err_pid1631276.log
Main.java.txt

creduce.sh.txt

Reduced (creduce) result:

class Main {
  static int b;
  static void c() {
    int[] a = {};
    for (int f = -1;; f++)
      for (int d = 4; d < 2147483646 * f - 3; d++)
        b = a[d + 4];
  }
  public static void main(String[] args) {
    try {
      c();
    } catch (ArrayIndexOutOfBoundsException e) {
    }
  }
}
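The reduced case hinges on the inner-loop bound `2147483646 * f - 3`, which overflows 32-bit int arithmetic once `f >= 2`; the C1 miscompile appears related to range-check elimination over this overflowing bound (my reading of the repro, not stated in the report). A standalone illustration of the overflow itself:

```java
public class BoundOverflowDemo {
    public static void main(String[] args) {
        int f = 2;
        // Exact value of the bound, computed in 64-bit arithmetic:
        long exact = 2147483646L * f - 3;   // 4294967289
        // The same expression in 32-bit int arithmetic wraps around:
        int wrapped = 2147483646 * f - 3;   // -7
        System.out.println(exact);          // prints 4294967289
        System.out.println(wrapped);        // prints -7
    }
}
```

Because the wrapped bound is negative for some values of `f` and huge for others, any bound analysis that assumes the expression is monotone in `f` can derive wrong range-check facts.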

[Upstream issue] compiler/intrinsics/sha/cli/TestUseSHA3IntrinsicsOptionOnSupportedCPU.java fails

https://tone.aliyun-inc.com/ws/xesljfzh/test_result/147251?tab=1

 stderr: [openjdk version "17.0.6" 2023-01-17
OpenJDK Runtime Environment (Alibaba Dragonwell Standard Edition)-17.0.6.0.6+9-GA (build 17.0.6+9)
OpenJDK 64-Bit Server VM (Alibaba Dragonwell Standard Edition)-17.0.6.0.6+9-GA (build 17.0.6+9, mixed mode, sharing)
]
 exitValue = 0

java.lang.AssertionError: Option 'UseSHA3Intrinsics' is expected to have 'true' value
Option 'UseSHA3Intrinsics' should be enabled by default
	at jdk.test.lib.cli.CommandLineOptionTest.verifyOptionValue(CommandLineOptionTest.java:307)
	at jdk.test.lib.cli.CommandLineOptionTest.verifyOptionValue(CommandLineOptionTest.java:280)
	at jdk.test.lib.cli.CommandLineOptionTest.verifyOptionValueForSameVM(CommandLineOptionTest.java:404)
	at compiler.intrinsics.sha.cli.testcases.GenericTestCaseForSupportedCPU.verifyOptionValues(GenericTestCaseForSupportedCPU.java:101)
	at compiler.intrinsics.sha.cli.DigestOptionsBase$TestCase.test(DigestOptionsBase.java:163)
	at compiler.intrinsics.sha.cli.DigestOptionsBase.runTestCases(DigestOptionsBase.java:139)
	at jdk.test.lib.cli.CommandLineOptionTest.test(CommandLineOptionTest.java:535)
	at compiler.intrinsics.sha.cli.TestUseSHA3IntrinsicsOptionOnSupportedCPU.main(TestUseSHA3IntrinsicsOptionOnSupportedCPU.java:46)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:568)
	at com.sun.javatest.regtest.agent.MainWrapper$MainThread.run(MainWrapper.java:127)
	at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: java.lang.RuntimeException: 'UseSHA3Intrinsics\s*:?=\s*true' missing from stdout/stderr 

	at jdk.test.lib.process.OutputAnalyzer.shouldMatch(OutputAnalyzer.java:340)
	at jdk.test.lib.cli.CommandLineOptionTest.verifyOptionValue(CommandLineOptionTest.java:299)
	... 13 more
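The root failure message shows the test's regex `UseSHA3Intrinsics\s*:?=\s*true` not matching the VM's `-XX:+PrintFlagsFinal` output. A small sketch of that pattern check (the sample flag line below is illustrative of the expected output, not copied from this run):

```java
import java.util.regex.Pattern;

public class FlagPatternDemo {
    public static void main(String[] args) {
        // A typical -XX:+PrintFlagsFinal line when the intrinsic is enabled:
        String line = "     bool UseSHA3Intrinsics = true";
        // The pattern OutputAnalyzer.shouldMatch looks for in the test:
        Pattern expected = Pattern.compile("UseSHA3Intrinsics\\s*:?=\\s*true");
        System.out.println(expected.matcher(line).find());  // prints true
    }
}
```

On the failing machine the flag is evidently reported as `false` (or absent), so the match fails even though the CPU is detected as supporting SHA3.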
