
Nginx + WebAssembly

License: Apache License 2.0


ngx_wasm_module's Introduction

WasmX logo

WasmX/ngx_wasm_module

Nginx + WebAssembly

This module embeds WebAssembly runtimes inside Nginx and aims to offer several host SDK abstractions for extending and/or introspecting the Nginx web server/proxy runtime.

Currently, the module implements a Proxy-Wasm host ABI, which allows the use of client SDKs written in multiple languages, such as Rust and Go. Proxy-Wasm ("WebAssembly for Proxies") is an emerging standard for Wasm filters, adopted by API Gateways such as Kong and Envoy.
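To make the Proxy-Wasm model concrete: a filter is a Wasm module exporting callbacks that the host (here, Nginx) invokes at each proxy phase. The sketch below mimics the shape of the Rust SDK's `HttpContext` trait with stand-in types so that it runs standalone; a real filter would depend on the `proxy-wasm` crate and be compiled to a wasm32 target.

```rust
// Stand-in types approximating the proxy-wasm Rust SDK surface
// (hypothetical mock; the real trait lives in the `proxy-wasm` crate).
#[derive(Debug, PartialEq)]
enum Action {
    Continue, // let the host proceed with the request
    Pause,    // suspend processing until resumed
}

trait HttpContext {
    // Invoked by the host once request headers are available.
    fn on_http_request_headers(&mut self, num_headers: usize, end_of_stream: bool) -> Action;
}

struct MyFilter {
    headers_seen: usize,
}

impl HttpContext for MyFilter {
    fn on_http_request_headers(&mut self, num_headers: usize, _end_of_stream: bool) -> Action {
        self.headers_seen = num_headers;
        Action::Continue
    }
}

fn main() {
    // Simulate the host delivering a request with 4 headers.
    let mut filter = MyFilter { headers_seen: 0 };
    assert_eq!(filter.on_http_request_headers(4, true), Action::Continue);
    assert_eq!(filter.headers_seen, 4);
    println!("ok");
}
```

In the real SDK the host, not a `main` function, drives these callbacks, and registration macros wire the filter into the runtime.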

What is WasmX?

WasmX aims to extend Nginx for modern Web infrastructure. This includes supporting WebAssembly runtimes & SDKs (by way of ngx_wasm_module) and generally increasing the breadth of features relied upon by the API Gateway use case (i.e. reverse proxying). See CONTRIBUTING.md for additional background and roadmap information.

Table of Contents

  • Synopsis
  • Examples
  • Documentation
  • License

Synopsis

# nginx.conf
events {}

# nginx master process gets a default 'main' VM
# a new top-level configuration block receives all configuration for this main VM
wasm {
    #      [name]    [path.{wasm,wat}]
    module my_filter /path/to/filter.wasm;
    module my_module /path/to/module.wasm;
}

# each nginx worker process is able to instantiate wasm modules in its subsystems
http {
    server {
        listen 9000;

        location / {
            # execute a proxy-wasm filter when proxying
            #           [module]
            proxy_wasm  my_filter;

            # execute more WebAssembly during the access phase
            #           [phase] [module]  [function]
            wasm_call   access  my_module check_something;

            proxy_pass  ...;
        }
    }

    # other directives
    wasm_socket_connect_timeout 60s;
    wasm_socket_send_timeout    60s;
    wasm_socket_read_timeout    60s;

    wasm_socket_buffer_size     8k;
    wasm_socket_large_buffers   32 16k;
}

Back to TOC

Examples

Several "showcase filters" are provided as examples by authors of this module:

More examples are available for each Proxy-Wasm SDK:

Note that some of the above examples may not yet be compatible with ngx_wasm_module.

Last but not least, the WebAssembly Hub contains many other Proxy-Wasm filters, some of which may not yet be compatible with ngx_wasm_module.

Back to TOC

Documentation

Usage

See the user documentation for resources on this module's usage.

Back to TOC

Installation

Releases are published in three distinct release channels:

  • Release: Stable releases. A prerelease is considered stable and promoted to a release based on usage mileage and feedback.
  • Prerelease: Unstable releases. All new release versions (e.g. release-1.0.0) are first introduced through prereleases (i.e. prerelease-1.0.0-beta1) before being promoted to a stable release.
  • Nightly: Releases cut from the latest main branch. Presently, nightly releases are built every Monday. The release interval may change in the future. See the Nightly release tag to download released artifacts.

Each release channel produces the following artifacts for each release:

  • ngx_wasm_module-$release.tar.gz: a tarball of the ngx_wasm_module release. To be compiled alongside Nginx with --add-module= or --add-dynamic-module=.
  • wasmx-$release-$runtime-$arch-$os.tar.gz: a pre-compiled nginx executable built with ngx_wasm_module for the specified runtime/architecture/OS. Download one of these releases to use the nginx binary immediately.

See the installation documentation for instructions on how to install this module or use one of the binary releases.
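As a sketch of the source-release route (the Nginx version and paths below are placeholders; see the installation documentation for the authoritative steps):

```shell
# Unpack the module release and compile it into Nginx statically;
# use --add-dynamic-module= instead to produce a loadable .so.
tar -xzf ngx_wasm_module-$release.tar.gz
tar -xzf nginx-1.25.3.tar.gz
cd nginx-1.25.3
./configure --add-module=../ngx_wasm_module-$release
make -j4
sudo make install
```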

Back to TOC

Development

See the developer documentation for developer resources on building this module from source and other general development processes.

See a term you are unfamiliar with? Consult the code lexicon.

For a primer on the code's layout and architecture, see the code layout section.

Back to TOC

Proxy-Wasm SDK

The Proxy-Wasm SDK is the initial focus of WasmX/ngx_wasm_module development and is still a work in progress. You can browse PROXY_WASM.md for a guide on Proxy-Wasm support in ngx_wasm_module.

Because the ABI specification is still evolving, a reliable reference is the SDK source for the language of your choice in the Proxy-Wasm SDKs list.

Back to TOC

WebAssembly

Back to TOC

WebAssembly Runtimes

Back to TOC

License

Copyright 2020-2024 Kong Inc.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

   http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

Back to TOC

ngx_wasm_module's People

Contributors

casimiro, dependabot[bot], dndx, flrgh, hishamhm, javierguerragiraldez, losfair, marcsvll, pluveto, subnetmarco, t-yuki, thibaultcha


ngx_wasm_module's Issues

Fix loading filters from OpenResty with dynamic ngx_wasm_module.so

We currently can't load filters from OpenResty when WasmX is used as a shared library.
The following configuration doesn't work:

load_module ngx_wasm_module.so;

wasm {
    module on_phases on_phases.wasm;
}

http {
    init_worker_by_lua_block {
        local proxy_wasm = require "resty.http.proxy_wasm"
        local filters = {
            { name = "on_phases" },
        }

        local c_plan, err = proxy_wasm.new(filters)
        if not c_plan then
            ngx.log(ngx.ERR, err)
            return
        end

        local ok, err = proxy_wasm.load(c_plan)
        if not ok then
            ngx.log(ngx.ERR, err)
            return
        end
    }
}

How to reproduce

It can be reproduced with the following command:
export NGX_BUILD_DYNAMIC_MODULE=1 NGX_BUILD_OPENRESTY=1.21.4.1 && make && ./util/test.sh t/04-openresty/ffi/101-proxy_wasm_load.t

This produces the following in error.log:

2023/06/20 13:28:39 [debug] 903940#0: wasm loading plan: 000055FC6452A660
2023/06/20 13:28:39 [emerg] 903940#0: [wasm] failed linking "on_phases" module with "ngx_proxy_wasm" host interface: module not loaded <vm: "main", runtime: "wasmtime">
2023/06/20 13:28:39 [error] 903940#0: *1 [lua] init_worker_by_lua:15: unknown error, context: init_worker_by_lua*
2023/06/20 13:28:39 [debug] 903940#0: wasm freeing plan: 000055FC6452A660

Why this happens

This happens because ngx_wasm_core_init_process -- which indirectly calls ngx_wavm_module_load -- hasn't yet been called when ngx_wavm_module_link is indirectly invoked by ngx_http_lua_init_worker_by_inline.

This can be verified by running gdb with proper breakpoints:

gdb -ex 'layout src' -ex 'b ngx_wavm_module_load' -ex 'b ngx_wavm_module_link' -ex 'r -g "daemon off;"' work/buildroot/nginx

#0  ngx_wavm_module_link (module=0x555555872828, host=0x7ffff761dde0 <ngx_proxy_wasm_host>) at src/wasm/vm/ngx_wavm.c:685
#1  0x00007ffff75e3ee2 in ngx_wasm_ops_plan_load (plan=0x55555591b218, log=0x55555586f2f8) at src/wasm/ngx_wasm_ops.c:167
#2  0x00007ffff75eb7c1 in ngx_http_wasm_ffi_plan_load (plan=0x55555591b218) at src/common/ngx_wasm_ffi.c:85
#3  0x00007ffff7ec0fce in lj_vm_ffi_call () from work/buildroot/prefix/luajit/lib/libluajit-5.1.so.2
#4  0x00007ffff7f2012d in lj_ccall_func (L=<optimized out>, cd=<optimized out>) at lj_ccall.c:1382
#5  0x00007ffff7f3bc3d in lj_cf_ffi_meta___call (L=0x7ffff75a4380) at lib_ffi.c:230
#6  0x00007ffff7ebeb83 in lj_BC_FUNCC () from work/buildroot/prefix/luajit/lib/libluajit-5.1.so.2
#7  0x00007ffff7ed9b55 in lua_pcall (L=0x7ffff75a4380, nargs=0, nresults=0, errfunc=<optimized out>) at lj_api.c:1145
#8  0x0000555555702dc6 in ngx_http_lua_do_call (log=0x55555586f2f8, L=0x7ffff75a4380) at ../ngx_lua-0.10.21/src/ngx_http_lua_util.c:4192
#9  0x0000555555723f8c in ngx_http_lua_init_worker_by_inline (log=0x55555586f2f8, lmcf=0x55555587b570, L=0x7ffff75a4380) at ../ngx_lua-0.10.21/src/ngx_http_lua_initworkerby.c:323
#10 0x0000555555723eae in ngx_http_lua_init_worker (cycle=0x55555586f2e0) at ../ngx_lua-0.10.21/src/ngx_http_lua_initworkerby.c:296
#11 0x00005555555d9a28 in ngx_single_process_cycle (cycle=0x55555586f2e0) at src/os/unix/ngx_process_cycle.c:299
#12 0x000055555558f619 in main (argc=3, argv=0x7fffffffd558) at src/core/nginx.c:383

This happens because WasmX modules are added to cycle->modules after the OpenResty modules. These are the last elements of cycle->modules at ngx_single_process_cycle:

...
$67 = (ngx_module_t *) 0x55555582bee0 <ngx_http_lua_module>
...
$92 = (ngx_module_t *) 0x55555582ea40 <ngx_stream_lua_module>
$93 = (ngx_module_t *) 0x7ffff761cae0 <ngx_wasm_module>
$94 = (ngx_module_t *) 0x7ffff761f4c0 <ngx_wasm_core_module>
$95 = (ngx_module_t *) 0x7ffff761f820 <ngx_http_wasm_module>
$96 = (ngx_module_t *) 0x7ffff76205e0 <ngx_http_wasm_filter_module>

And the WasmX modules are added later because static modules are added to the modules list as soon as it is created in ngx_cycle_modules, while dynamically loaded modules are only appended afterwards.

Why doesn't this happen when WasmX is statically linked?

When WasmX is statically compiled into Nginx, cycle->modules looks like this:

$1 = (ngx_module_t *) 0x55555584c1a0 <ngx_core_module>
$2 = (ngx_module_t *) 0x55555584c6e0 <ngx_errlog_module>
$3 = (ngx_module_t *) 0x55555584d360 <ngx_conf_module>
$4 = (ngx_module_t *) 0x55555586ff40 <ngx_wasm_module>
$5 = (ngx_module_t *) 0x555555872920 <ngx_wasm_core_module>
...
$51 = (ngx_module_t *) 0x55555586d8a0 <ngx_http_lua_upstream_module>
...
$56 = (ngx_module_t *) 0x555555872c80 <ngx_http_wasm_module>
...
$68 = (ngx_module_t *) 0x555555873a40 <ngx_http_wasm_filter_module>
...
$71 = (ngx_module_t *) 0x55555586d0e0 <ngx_http_lua_module>
...
$96 = (ngx_module_t *) 0x55555586fc40 <ngx_stream_lua_module>
$97 = (ngx_module_t *) 0x555555873b20 <ngx_stream_wasm_module>

This results in ngx_wasm_core_init_process being called before ngx_http_lua_init_worker_by_inline.
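The ordering difference can be modeled in a few lines. This is a toy simulation of the behavior described in this issue (Nginx fires init_process hooks in cycle->modules array order, and dynamically loaded modules are appended after all static ones); the function and list contents are illustrative only:

```rust
// Toy model: init_process hooks fire in cycle->modules order.
fn init_process_order<'a>(static_modules: &[&'a str], dynamic_modules: &[&'a str]) -> Vec<&'a str> {
    let mut modules = Vec::new();
    modules.extend_from_slice(static_modules);  // added when the list is created
    modules.extend_from_slice(dynamic_modules); // appended later for load_module
    modules
}

fn main() {
    // Statically linked: ngx_wasm_core_module sits before ngx_http_lua_module.
    let static_build = init_process_order(
        &["ngx_core_module", "ngx_wasm_core_module", "ngx_http_lua_module"],
        &[],
    );
    // Dynamically loaded: the wasm modules land after every static module.
    let dynamic_build = init_process_order(
        &["ngx_core_module", "ngx_http_lua_module"],
        &["ngx_wasm_core_module"],
    );

    let pos = |v: &Vec<&str>, m: &str| v.iter().position(|x| *x == m);
    assert!(pos(&static_build, "ngx_wasm_core_module") < pos(&static_build, "ngx_http_lua_module"));
    assert_eq!(*dynamic_build.last().unwrap(), "ngx_wasm_core_module");
    println!("ok");
}
```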

Blank response if we fail to initialize

This issue is a follow-up to the conversation we had at #149, original context here.

Since we moved the initialization of the Wasm engine from the master process to the workers (to avoid fork issues with V8), loading/linking of wasm modules now happens in the workers, which initialize on demand. As a consequence, if we get a loading/linking failure when initializing a wasm module (for example, due to an incorrect import in the wasm file or an unimplemented WASI function on our side), initialization fails at the time of the first request sent to the worker.

The issue is that, on initialization failure, we currently produce a blank response instead of the expected HTTP 500 sent back to the client.

Semantics of parameter `end_of_stream` for `proxy_on_http_*_headers()` differ from Envoy.

During header processing, ngx_wasm_module invokes the last proxy_on_http_request_headers() with the end_of_stream parameter set to true, regardless of whether a body has been sent. Its meaning seems to be "has the header stream ended?"

In Envoy's proxy-wasm, on the other hand, when a request contains a body, the end_of_stream parameter for proxy_on_http_request_headers() is set to false. Its meaning appears to be "has the entire transmission finished?" When no body has been sent, end_of_stream is set to true.

The same situation occurs with proxy_on_http_response_headers().
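The two interpretations can be contrasted in a tiny sketch (illustrative only; both helper functions are hypothetical and belong to neither codebase):

```rust
// end_of_stream as observed in ngx_wasm_module: "has the header
// stream ended?" -- true on the last headers callback either way.
fn end_of_stream_ngx_wasm(_request_has_body: bool) -> bool {
    true
}

// end_of_stream as observed in Envoy: "has the entire transmission
// finished?" -- false whenever a body is still to come.
fn end_of_stream_envoy(request_has_body: bool) -> bool {
    !request_has_body
}

fn main() {
    // A request with a body: the two hosts disagree on the flag.
    assert_eq!(end_of_stream_ngx_wasm(true), true);
    assert_eq!(end_of_stream_envoy(true), false);
    // Without a body, both report true.
    assert_eq!(end_of_stream_envoy(false), true);
    println!("ok");
}
```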

wasmx-nightly-20220912-v8-amd64-macos fails to initialize

Emiliano reported this error with the following log:

thread '<unnamed>' panicked at 'called `Option::unwrap()` on a `None` value', lib/c-api/src/wasm_c_api/instance.rs:54:46
stack backtrace:
   0:        0x101f6d061 - <std::sys_common::backtrace::_print::DisplayBacktrace as core::fmt::Display>::fmt::h4f26ffad025fdbe8
   1:        0x101fc380b - core::fmt::write::h0a9937d83d3944c1
   2:        0x101f5d8a8 - std::io::Write::write_fmt::hfaf2e2e92eda8127
   3:        0x101f71547 - std::panicking::default_hook::{{closure}}::hc11e9b8d348e68b0
   4:        0x101f71155 - std::panicking::default_hook::h1d26ec4d0d63be04
   5:        0x101f71d10 - std::panicking::rust_panic_with_hook::hef4f5e524db188b3
   6:        0x101f71a14 - std::panicking::begin_panic_handler::{{closure}}::h6e8805ea2351af89
   7:        0x101f6d4d7 - std::sys_common::backtrace::__rust_end_short_backtrace::hd383ade987b76f63
   8:        0x101f7171a - _rust_begin_unwind
   9:        0x1032e288f - core::panicking::panic_fmt::hb58956db718d5b79
  10:        0x1032e27e7 - core::panicking::panic::h709cad72bd37e428
  11:        0x101745c38 - <wasmer::ordered_resolver::OrderedResolver as core::iter::traits::collect::FromIterator<wasmer::sys::externals::Extern>>::from_iter::hc77b09aee0e50a2c
  12:        0x10171f116 - _wasm_instance_new
  13:        0x1017107fc - _ngx_v8_init_instance
  14:        0x1017076ae - _ngx_wavm_instance_create
  15:        0x10170c40b - _ngx_proxy_wasm_filter_init
  16:        0x1017050fc - _ngx_wasm_ops_engine_init
  17:        0x101710fb8 - _ngx_http_wasm_init_process
  18:        0x101677c5f - _ngx_single_process_cycle
  19:        0x10164ab60 - _main
fatal runtime error: failed to initiate panic, error 5
[1]    54740 abort      ./nginx

Memory corruption when dispatch socket outlives request

I've been observing frequent segfaults when running the datakit filter on a kong/kong:nightly container which uses ngx_wasm_module commit d5e4c69 and a filter configuration that dispatches multiple HTTP subrequests. (I could not reproduce the problem on a configuration with a single HTTP dispatch).

This was one of the segfaults:

```
Program received signal SIGSEGV, Segmentation fault.
0x000057f6f83fade1 in ngx_palloc_large (pool=pool@entry=0x57f6fa0e2710, size=size@entry=1120) at src/core/ngx_palloc.c:244
244     src/core/ngx_palloc.c: No such file or directory.
(gdb) bt
#0  0x000057f6f83fade1 in ngx_palloc_large (pool=pool@entry=0x57f6fa0e2710, size=size@entry=1120) at src/core/ngx_palloc.c:244
#1  0x000057f6f83fafdf in ngx_palloc (pool=pool@entry=0x57f6fa0e2710, size=size@entry=1120) at src/core/ngx_palloc.c:131
#2  0x00007723db389217 in ngx_list_init (size=56, n=20, pool=0x57f6fa0e2710, list=0x57f6f9f32ec0) at src/core/ngx_list.h:39
#3  ngx_wasm_socket_read_http_response (sock=0x57f6f9f328e8, bytes=539, ctx=0x57f6f9f32cc8) at /home/runner/.cache/bazel/_bazel_runner/5f5cb99aeb7e2b4db19484ea5022e949/execroot/kong/external/ngx_wasm_module/src/common/ngx_wasm_socket_tcp.c:968
#4  0x00007723db3894ba in ngx_wasm_socket_tcp_read (sock=sock@entry=0x57f6f9f328e8, reader=0x7723db389150 , reader_ctx=reader_ctx@entry=0x57f6f9f32cc8) at /home/runner/.cache/bazel/_bazel_runner/5f5cb99aeb7e2b4db19484ea5022e949/execroot/kong/external/ngx_wasm_module/src/common/ngx_wasm_socket_tcp.c:1064
#5  0x00007723db39b934 in ngx_http_proxy_wasm_dispatch_resume_handler (sock=) at /home/runner/.cache/bazel/_bazel_runner/5f5cb99aeb7e2b4db19484ea5022e949/execroot/kong/external/ngx_wasm_module/src/http/proxy_wasm/ngx_http_proxy_wasm_dispatch.c:785
#6  0x00007723db387f1f in ngx_wasm_socket_tcp_resume (sock=) at /home/runner/.cache/bazel/_bazel_runner/5f5cb99aeb7e2b4db19484ea5022e949/execroot/kong/external/ngx_wasm_module/src/common/ngx_wasm_socket_tcp.c:90
#7  ngx_wasm_socket_tcp_resume (sock=) at /home/runner/.cache/bazel/_bazel_runner/5f5cb99aeb7e2b4db19484ea5022e949/execroot/kong/external/ngx_wasm_module/src/common/ngx_wasm_socket_tcp.c:76
#8  0x000057f6f8425a6f in ngx_epoll_process_events (cycle=, timer=, flags=) at src/event/modules/ngx_epoll_module.c:901
#9  0x000057f6f841b79a in ngx_process_events_and_timers (cycle=cycle@entry=0x57f6f87991b0) at src/event/ngx_event.c:258
#10 0x000057f6f842384e in ngx_worker_process_cycle (cycle=cycle@entry=0x57f6f87991b0, data=data@entry=0x0) at src/os/unix/ngx_process_cycle.c:793
#11 0x000057f6f8421c0f in ngx_spawn_process (cycle=cycle@entry=0x57f6f87991b0, proc=0x57f6f84237d0 , data=0x0, name=0x57f6f85f4004 "worker process", respawn=respawn@entry=0) at src/os/unix/ngx_process.c:199
#12 0x000057f6f8424a10 in ngx_reap_children (cycle=0x57f6f87991b0) at src/os/unix/ngx_process_cycle.c:665
#13 ngx_master_process_cycle (cycle=0x57f6f87991b0) at src/os/unix/ngx_process_cycle.c:180
#14 0x000057f6f83f884e in main (argc=, argv=) at src/core/nginx.c:387
```

I got others with different backtraces, which suggested an earlier memory corruption.

I installed Valgrind in the container and got a ton of memory errors, all of which seem to follow from the same cause as the first one:

```
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm] resetting filter chain: pwctx->exec_index 0 to 0 (pwctx: 00000000056B6A50)
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] filter 1/1 resuming "on_response_headers" step in "header_filter" phase
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] filter 1/1 resuming "on_response_body" step in "body_filter" phase
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] filter 1/1 resuming "on_response_body" step in "body_filter" phase
172.17.0.1 - - [09/Apr/2024:21:56:42 +0000] "GET /demo HTTP/1.1" 404 648 "-" "HTTPie/2.6.0" kong_request_id: "0dcdce20288d1e0b60aee81a85b68465"
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] filter 1/1 resuming "on_log" step in "log" phase
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] filter 1/1 resuming "on_done" step in "done" phase
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] filter 1/1 finalizing context
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm] "datakit_filter" filter freeing context #1 (1/1)
2024/04/09 21:56:42 [info] 1321#0: *645 client 172.17.0.1 closed keepalive connection
==1321== Invalid read of size 8
==1321==    at 0x5D21811: ngx_http_proxy_wasm_dispatch_resume_handler (ngx_http_proxy_wasm_dispatch.c:708)
==1321==    by 0x5D0DF1E: ngx_wasm_socket_tcp_resume (ngx_wasm_socket_tcp.c:90)
==1321==    by 0x5D0DF1E: ngx_wasm_socket_tcp_resume (ngx_wasm_socket_tcp.c:76)
==1321==    by 0x177A6E: ngx_epoll_process_events (ngx_epoll_module.c:901)
==1321==    by 0x16D799: ngx_process_events_and_timers (ngx_event.c:258)
==1321==    by 0x17584D: ngx_worker_process_cycle (ngx_process_cycle.c:793)
==1321==    by 0x173C0E: ngx_spawn_process (ngx_process.c:199)
==1321==    by 0x174F0F: ngx_start_worker_processes (ngx_process_cycle.c:382)
==1321==    by 0x1762B3: ngx_master_process_cycle (ngx_process_cycle.c:135)
==1321==    by 0x14A84D: main (nginx.c:387)
==1321== Address 0x6fd9260 is 2,704 bytes inside a block of size 4,096 free'd
==1321==    at 0x484B27F: free (in /usr/libexec/valgrind/vgpreload_memcheck-amd64-linux.so)
==1321==    by 0x14CF36: ngx_destroy_pool (ngx_palloc.c:90)
==1321==    by 0x18FF2C: ngx_http_free_request (ngx_http_request.c:3771)
==1321==    by 0x190B53: ngx_http_set_keepalive (ngx_http_request.c:3151)
==1321==    by 0x190B53: ngx_http_finalize_connection.part.0 (ngx_http_request.c:2788)
==1321==    by 0x1A24C1: ngx_http_upstream_handler (ngx_http_upstream.c:1309)
==1321==    by 0x177A6E: ngx_epoll_process_events (ngx_epoll_module.c:901)
==1321==    by 0x16D799: ngx_process_events_and_timers (ngx_event.c:258)
==1321==    by 0x17584D: ngx_worker_process_cycle (ngx_process_cycle.c:793)
==1321==    by 0x173C0E: ngx_spawn_process (ngx_process.c:199)
==1321==    by 0x174F0F: ngx_start_worker_processes (ngx_process_cycle.c:382)
==1321==    by 0x1762B3: ngx_master_process_cycle (ngx_process_cycle.c:135)
==1321==    by 0x14A84D: main (nginx.c:387)
==1321== Block was alloc'd at
==1321==    at 0x484DE30: memalign (in /usr/libexec/valgrind/vgpreload_memcheck-amd64-linux.so)
==1321==    by 0x484DF92: posix_memalign (in /usr/libexec/valgrind/vgpreload_memcheck-amd64-linux.so)
==1321==    by 0x1711BA: ngx_memalign (ngx_alloc.c:57)
==1321==    by 0x14CC5A: ngx_palloc_block (ngx_palloc.c:186)
==1321==    by 0x14D195: ngx_pcalloc (ngx_palloc.c:302)
==1321==    by 0x18F393: ngx_http_alloc_request (ngx_http_request.c:622)
==1321==    by 0x18F814: ngx_http_create_request (ngx_http_request.c:534)
==1321==    by 0x192D81: ngx_http_wait_request_handler (ngx_http_request.c:516)
==1321==    by 0x177A6E: ngx_epoll_process_events (ngx_epoll_module.c:901)
==1321==    by 0x16D799: ngx_process_events_and_timers (ngx_event.c:258)
==1321==    by 0x17584D: ngx_worker_process_cycle (ngx_process_cycle.c:793)
==1321==    by 0x173C0E: ngx_spawn_process (ngx_process.c:199)
==1321==
```

ngx_wasm_socket_tcp_resume triggers the ngx_http_proxy_wasm_dispatch_resume_handler, but the request data has already been destroyed by ngx_http_free_request.

The invalid data in the Valgrind trace above is call->pwexec, where call comes from sock->data, which is the input argument of the function.

The deallocation at ngx_http_free_request is ngx_destroy_pool for r->pool.

One weird behavior quirk I noticed in the logs before the first Valgrind error is that I did get two call: run lines (which trigger in my filter from on_http_request_headers) but only one call: resume line (which triggers from on_http_call_response). The resume operation seems delayed until after ngx_http_free_request already destroyed the main request's data, taking the pwexec context with it.

This is unexpected because in normal operation the filter should be paused waiting for the last dispatch to finish.

Normally, in my local build, I get this sequence of operations for two dispatch calls:

Data for first dispatch arrives:

2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm tcp socket handler (wev: 0)
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm tcp socket receive handler for "127.0.0.1:6502"
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm tcp socket resuming
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm tcp socket trying to receive data (max: 1024)
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm tcp socket resuming http response reading with 243 bytes to parse
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm tcp socket resuming http response reading with 226 bytes to parse
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm reuse free buf memory 103 >= 69, cl:00005A8EFF7BA8D0, p:00005A8F001E1FB0
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm allocate new chainlink and new buf of size 69, cl: 00005A8EFFC30520, buf: 00005A8EFFC30560
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm tcp socket reading done
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm tcp socket closing

First dispatch resumes:

2024/04/09 19:45:56 [debug] 1600878#0: *2 [proxy-wasm]["datakit_filter" #1] filter 1/1 resuming "on_dispatch_response" step in "background" phase
2024/04/09 19:45:56 [debug] 1600878#0: *2 [proxy-wasm]["datakit_filter" #1] DataKitFilter: on http call response, id = 0
2024/04/09 19:45:56 [debug] 1600878#0: *2 [proxy-wasm]["datakit_filter" #1] call: resume
2024/04/09 19:45:56 [debug] 1600878#0: *2 proxy_wasm_alloc: 1376256:1343312:69
2024/04/09 19:45:56 [debug] 1600878#0: *2 proxy_wasm_alloc: 1376256:1343440:16

it PAUSEs at the end because there's another one pending:

2024/04/09 19:45:56 [debug] 1600878#0: *2 proxy_wasm more http dispatch calls pending...
2024/04/09 19:45:56 [debug] 1600878#0: *2 [proxy-wasm] setting next action: pwctx->action = "PAUSE" (pwctx: 00005A8F014EDF00)

Data for second dispatch arrives:

2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm tcp socket handler (wev: 0)
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm tcp socket receive handler for "127.0.0.1:8008"
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm tcp socket resuming
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm tcp socket trying to receive data (max: 1024)
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm tcp socket resuming http response reading with 200 bytes to parse
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm tcp socket resuming http response reading with 183 bytes to parse
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm allocate new chainlink and new buf of size 26, cl: 00005A8F01586240, buf: 00005A8EFF866290
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm allocate new chainlink and new buf of size 26, cl: 00005A8F0048BE80, buf: 00005A8F0048BEC0
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm tcp socket reading done
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm tcp socket closing

second dispatch resumes:

2024/04/09 19:45:56 [debug] 1600878#0: *2 [proxy-wasm]["datakit_filter" #1] filter 1/1 resuming "on_dispatch_response" step in "background" phase
2024/04/09 19:45:56 [debug] 1600878#0: *2 [proxy-wasm]["datakit_filter" #1] DataKitFilter: on http call response, id = 1
2024/04/09 19:45:56 [debug] 1600878#0: *2 [proxy-wasm]["datakit_filter" #1] call: resume
2024/04/09 19:45:56 [debug] 1600878#0: *2 proxy_wasm_alloc: 1376256:1343440:26
2024/04/09 19:45:56 [debug] 1600878#0: *2 proxy_wasm_alloc: 1376256:1346416:16
2024/04/09 19:45:56 [debug] 1600878#0: *2 [proxy-wasm]["datakit_filter" #1] template: run
2024/04/09 19:45:56 [debug] 1600878#0: *2 wasm allocate new chainlink and new buf of size 56, cl: 00005A8EFFCA3D30, buf: 00005A8EFFCA3D70

done, it sets CONTINUE and the next phases (on_response_headers etc.) continue:

2024/04/09 19:45:56 [debug] 1600878#0: *2 proxy_wasm last http dispatch call handled
2024/04/09 19:45:56 [debug] 1600878#0: *2 [proxy-wasm] setting next action: pwctx->action = "CONTINUE" (pwctx: 00005A8F014EDF00)
2024/04/09 19:45:56 [debug] 1600878#0: *2 [proxy-wasm] resetting filter chain: pwctx->exec_index 0 to 0 (pwctx: 00005A8F014EDF00)
2024/04/09 19:45:56 [debug] 1600878#0: *2 proxy_wasm return action: "CONTINUE"
2024/04/09 19:45:56 [debug] 1600878#0: *2 [proxy-wasm]["datakit_filter" #1] filter 1/1 resuming "on_response_headers" step in "header_filter" phase

Contrast to what happens in the crashing case:

(The wasm tcp socket logs and the like did not show up in the container logs for some reason.)

I trigger two calls, with dispatch ids 0 and 1:

2024/04/09 21:56:41 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] filter 1/1 resuming "on_request_headers" step in "rewrite" phase
2024/04/09 21:56:41 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] call: run
2024/04/09 21:56:41 [debug] 1321#0: *645 [proxy-wasm] setting next action: pwctx->action = "PAUSE" (pwctx: 00000000056B6A50)
2024/04/09 21:56:41 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] call: dispatch call id: 0
2024/04/09 21:56:41 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] call: run
2024/04/09 21:56:41 [debug] 1321#0: *645 [proxy-wasm] setting next action: pwctx->action = "PAUSE" (pwctx: 00000000056B6A50)
2024/04/09 21:56:41 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] call: dispatch call id: 1

The first one resumes, but CONTINUE is set right away.

2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] filter 1/1 resuming "on_dispatch_response" step in "background" phase
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm] setting next action: pwctx->action = "CONTINUE" (pwctx: 00000000056B6A50)
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] DataKitFilter: on http call response, id = 0
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] call: resume
2024/04/09 21:56:42 [debug] 1321#0: *645 [lua] init.lua:1362: balancer(): setting address (try 1): 93.184.216.34:80
2024/04/09 21:56:42 [debug] 1321#0: *645 [lua] init.lua:1395: balancer(): enabled connection keepalive (pool=93.184.216.34|80, pool_size=512, idle_timeout=60, max_requests=10000)
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm] resetting filter chain: pwctx->exec_index 0 to 0 (pwctx: 00000000056B6A50)

This causes all other phases to run and the filter context to be freed.

2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] filter 1/1 resuming "on_response_headers" step in "header_filter" phase
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] filter 1/1 resuming "on_response_body" step in "body_filter" phase
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] filter 1/1 resuming "on_response_body" step in "body_filter" phase
172.17.0.1 - - [09/Apr/2024:21:56:42 +0000] "GET /demo HTTP/1.1" 404 648 "-" "HTTPie/2.6.0" kong_request_id: "0dcdce20288d1e0b60aee81a85b68465"
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] filter 1/1 resuming "on_log" step in "log" phase
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] filter 1/1 resuming "on_done" step in "done" phase
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm]["datakit_filter" #1] filter 1/1 finalizing context
2024/04/09 21:56:42 [debug] 1321#0: *645 [proxy-wasm] "datakit_filter" filter freeing context #1 (1/1)
2024/04/09 21:56:42 [info] 1321#0: *645 client 172.17.0.1 closed keepalive connection

And then when the resume handler runs, it's operating on dead data:

==1321== Invalid read of size 8
==1321==    at 0x5D21811: ngx_http_proxy_wasm_dispatch_resume_handler (ngx_http_proxy_wasm_dispatch.c:708)

This seems to indicate that the crash is caused by something being scheduled out-of-order earlier on (rather than something like "this piece of data is allocated with a pool with the wrong lifetime").

full_log.txt
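In miniature, the bug is the classic pattern of a callback outliving the per-request allocation it points to. The sketch below is purely illustrative (not the module's code or its fix): a resume handler holding only a weak handle can at least detect that the request died instead of dereferencing freed memory.

```rust
use std::cell::RefCell;
use std::rc::{Rc, Weak};

// Per-request state, analogous to data allocated from r->pool.
struct RequestCtx {
    log: RefCell<Vec<String>>,
}

// A pending dispatch call whose socket may outlive the request.
struct DispatchCall {
    request: Weak<RequestCtx>, // weak: does not keep the request alive
}

impl DispatchCall {
    // Analogue of the dispatch resume handler firing on socket readiness.
    fn resume(&self) -> bool {
        match self.request.upgrade() {
            Some(req) => {
                req.log.borrow_mut().push("resumed".to_string());
                true
            }
            // Request already freed: bail out instead of a use-after-free.
            None => false,
        }
    }
}

fn main() {
    let request = Rc::new(RequestCtx { log: RefCell::new(Vec::new()) });
    let call = DispatchCall { request: Rc::downgrade(&request) };

    assert!(call.resume());  // request still alive: safe to touch
    drop(request);           // analogue of ngx_http_free_request / ngx_destroy_pool
    assert!(!call.resume()); // the dangling resume is detected, not a segfault
    println!("ok");
}
```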

Need support for runtimes with differently named libraries

The current config system assumes that the runtime's name ($ngx_wasm_runtime_name) is the same as the library name. This need not be true for some runtimes; for example, WAMR's embeddable library is libiwasm.so.

In this situation we need to use $NGX_WASM_RUNTIME_LD_OPT to supply both the path (-L) and the library name (-l). There should be a cleaner mechanism.
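Under the current mechanism, a hypothetical build against WAMR would have to pass both flags explicitly through the variable mentioned above (the install prefix is illustrative):

```shell
# WAMR ships libiwasm.so rather than a library matching the runtime
# name, so point the linker at it by hand:
export NGX_WASM_RUNTIME_LD_OPT="-L/opt/wamr/lib -liwasm"
```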

Fix SIGFPE handling failure with Wasmtime in HUP mode

Occasionally our Large CI matrix will fail in Wasmtime+HUP mode because of our divide-by-zero test. It seems like the SIGFPE is not being caught by Wasmtime, perhaps because the signal disposition isn't properly set up in reloaded processes?

See: https://github.com/Kong/ngx_wasm_module/actions/runs/6233460791/job/16918758835

t/02-http/directives/001-wasm_call_directive.t TEST 11: wasm_call directive - catch runtime error sanity
==13054== 
==13054== Process terminating with default action of signal 8 (SIGFPE)
==13054==  Integer divide by zero at address 0x1004290527
==13054==    at 0x4F92023: ???
==13054==    by 0x4F92027: ???
==13054==    by 0x4F92027: ???
==13054==    by 0x57C6105: wasmtime_setjmp_12_0_1 (in /home/runner/work/ngx_wasm_module/ngx_wasm_module/work/runtimes/wasmtime-12.0.1/lib/libwasmtime.so)
==13054==    by 0x50F72AD: wasmtime_runtime::traphandlers::<impl wasmtime_runtime::traphandlers::call_thread_state::CallThreadState>::with (in /home/runner/work/ngx_wasm_module/ngx_wasm_module/work/runtimes/wasmtime-12.0.1/lib/libwasmtime.so)
==13054==    by 0x51BCDAD: wasmtime_runtime::traphandlers::catch_traps (in /home/runner/work/ngx_wasm_module/ngx_wasm_module/work/runtimes/wasmtime-12.0.1/lib/libwasmtime.so)
==13054==    by 0x50E9BC1: _ZN8wasmtime4func27invoke_wasm_and_catch_traps17hbc40715a7c54e5e3E.llvm.8324662194832750297 (in /home/runner/work/ngx_wasm_module/ngx_wasm_module/work/runtimes/wasmtime-12.0.1/lib/libwasmtime.so)
==13054==    by 0x50EC7CB: _ZN8wasmtime4func4Func9call_impl17h774ce661bab89722E.llvm.8324662194832750297 (in /home/runner/work/ngx_wasm_module/ngx_wasm_module/work/runtimes/wasmtime-12.0.1/lib/libwasmtime.so)
==13054==    by 0x51CCCE6: wasmtime_func_call (in /home/runner/work/ngx_wasm_module/ngx_wasm_module/work/runtimes/wasmtime-12.0.1/lib/libwasmtime.so)
==13054==    by 0x1FAB27: ngx_wasmtime_call (ngx_wrt_wasmtime.c:639)
==13054==    by 0x1EA27F: ngx_wavm_func_call (ngx_wavm.c:1107)
==13054==    by 0x1EA27F: ngx_wavm_instance_call_func_vec (ngx_wavm.c:1237)
==13054==    by 0x1E6043: ngx_wasm_op_call_handler (ngx_wasm_ops.c:356)
==13054==    by 0x1E686C: ngx_wasm_ops_resume (ngx_wasm_ops.c:293)
==13054==    by 0x1FC5C9: ngx_http_wasm_rewrite_handler (ngx_http_wasm_module.c:675)
==13054==    by 0x17AFE6: ngx_http_core_rewrite_phase (ngx_http_core_module.c:929)
==13054==    by 0x176B8C: ngx_http_core_run_phases (ngx_http_core_module.c:875)
==13054==    by 0x183173: ngx_http_process_request_headers (ngx_http_request.c:1529)
==13054==    by 0x1835EE: ngx_http_process_request_line (ngx_http_request.c:1196)
==13054==    by 0x16640A: ngx_epoll_process_events (ngx_epoll_module.c:901)
==13054==    by 0x15A9A3: ngx_process_events_and_timers (ngx_event.c:248)
==13054==    by 0x163B54: ngx_worker_process_cycle (ngx_process_cycle.c:721)
==13054==    by 0x1621F1: ngx_spawn_process (ngx_process.c:199)
==13054==    by 0x1633D7: ngx_start_worker_processes (ngx_process_cycle.c:344)
==13054==    by 0x164C41: ngx_master_process_cycle (ngx_process_cycle.c:234)
==13054==    by 0x13721E: main (nginx.c:384)

GitHub Actions: mismatch with work/ dir causes V8 to never be cached

util/runtimes/v8.sh manages the caching of its assets, aiming to take advantage of the "Setup cache - work/ dir" step performed by the GitHub Actions ci.yml workflow.

However, there seems to be some confusion regarding the work directory: there are currently two work/ directories: one at the root of the runner's $HOME directory, and another inside the repo checkout used by the runner.

v8.sh was being launched with the former by the wasm-runtime action and using the latter for caching via DIR_DOWNLOAD, resulting in a situation like this:

caching built assets in /home/runner/work/ngx_wasm_module/ngx_wasm_module/work/downloads/v8-10.5.18...
'/home/runner/work/v8-10.5.18/include/wasm.h' -> '/home/runner/work/ngx_wasm_module/ngx_wasm_module/work/downloads/v8-10.5.18/wasm.h'
'/home/runner/work/v8-10.5.18/lib/libwee8.a' -> '/home/runner/work/ngx_wasm_module/ngx_wasm_module/work/downloads/v8-10.5.18/libwee8.a'
'/home/runner/work/v8-10.5.18/include/cwabt.h' -> '/home/runner/work/ngx_wasm_module/ngx_wasm_module/work/downloads/v8-10.5.18/cwabt.h'
'/home/runner/work/v8-10.5.18/lib/libcwabt.a' -> '/home/runner/work/ngx_wasm_module/ngx_wasm_module/work/downloads/v8-10.5.18/libcwabt.a'

However, these don't seem to be found in subsequent runs, so the "Setup cache" CI step, which uses plain work/downloads, is presumably only caching the top-level one?...

I can do some simple tweaks to move the cached V8 assets around, but I thought I'd write down this summary of the situation first in case there's something wrong going on with the caching of work/ in general that might need to be fixed instead.
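For reference, the cache step's shape is roughly as follows (a sketch, not the actual ci.yml; step name and key are placeholders). The point is that a plain relative `path:` in actions/cache resolves against $GITHUB_WORKSPACE, i.e. inside the repo checkout, never against $HOME:

```yaml
# Hypothetical sketch of the "Setup cache - work/ dir" step.
# A relative path resolves to $GITHUB_WORKSPACE/work/downloads
# (/home/runner/work/ngx_wasm_module/ngx_wasm_module/work/downloads),
# not to $HOME/work.
- name: 'Setup cache - work/ dir'
  uses: actions/cache@v3
  with:
    path: work/downloads
    key: ${{ runner.os }}-work-downloads
```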

segfault during ngx_proxy_wasm_ctx_lookup

Running the kong/kong:nightly container (SHA kong/kong@sha256:efaa8af02211956a69fc180f6cffa1fcf64f45a44297f21c192a077087030fb9 ) with the coraza-proxy-wasm filter, I get a consistent segfault by triggering two proxied requests: the first one fails with cannot pause in "body_filter" phase ("on_response_body" step); the second one segfaults.

I haven't tried it with the latest ngx_wasm_module code yet; I will add more info to this issue as I gather it. No detailed repro steps for now, since my coraza-proxy-wasm testing environment is quite tweaked at the moment.

docker-compose logs:

WARNING: Found orphan containers (kong-service-provisioner-1) for this project. If you removed or renamed this service in your compose file, you can run this command with the --remove-orphans flag to clean it up.
Starting kong_httpbin_1 ... done
Recreating kong_kong_1  ... done
Attaching to kong_httpbin_1, kong_kong_1
httpbin_1  | time="2023-11-30T18:10:13.5321" msg="go-httpbin listening on http://0.0.0.0:8080"
kong_1     | 2023/11/30 18:10:14 [warn] 1#0: the "user" directive makes sense only if the master process runs with super-user privileges, ignored in /var/run/kong/nginx.conf:8
kong_1     | nginx: [warn] the "user" directive makes sense only if the master process runs with super-user privileges, ignored in /var/run/kong/nginx.conf:8
kong_1     | 2023/11/30 18:10:14 [notice] 1#0: [wasm] swapping modules: "ngx_http_headers_more_filter_module" (index: 61) and "ngx_http_wasm_filter_module" (index: 90)
kong_1     | 2023/11/30 18:10:14 [notice] 1#0: [wasm] swapping modules: "ngx_http_lua_module" (index: 60) and "ngx_wasm_core_module" (index: 88)
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: using the "epoll" event method
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: openresty/1.21.4.2
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: OS: Linux 6.2.0-37-generic
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: getrlimit(RLIMIT_NOFILE): 1048576:1048576
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: start worker processes
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: start worker process 1256
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: start worker process 1257
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: start worker process 1258
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: start worker process 1259
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: start worker process 1260
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: start worker process 1261
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: start worker process 1262
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: start worker process 1263
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: start worker process 1264
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: start worker process 1265
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: start worker process 1266
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: start worker process 1267
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: start worker process 1268
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: start worker process 1269
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: start worker process 1270
kong_1     | 2023/11/30 18:10:20 [notice] 1#0: start worker process 1271
kong_1     | 2023/11/30 18:10:26 [notice] 1263#0: *1 [lua] globalpatches.lua:73: sleep(): executing a blocking 'sleep' (0.001 seconds), context: init_worker_by_lua*
kong_1     | 2023/11/30 18:10:26 [notice] 1263#0: *1 [lua] globalpatches.lua:73: sleep(): executing a blocking 'sleep' (0.002 seconds), context: init_worker_by_lua*
kong_1     | 2023/11/30 18:10:26 [notice] 1265#0: *2 [lua] init.lua:259: purge(): [DB cache] purging (local) cache, context: init_worker_by_lua*
kong_1     | 2023/11/30 18:10:26 [notice] 1265#0: *2 [lua] init.lua:259: purge(): [DB cache] purging (local) cache, context: init_worker_by_lua*
kong_1     | 2023/11/30 18:10:26 [notice] 1265#0: *2 [kong] init.lua:522 declarative config loaded from /opt/kong/kong.yaml, context: init_worker_by_lua*
kong_1     | 2023/11/30 18:10:26 [notice] 1260#0: *5 [lua] globalpatches.lua:73: sleep(): executing a blocking 'sleep' (0.001 seconds), context: init_worker_by_lua*
kong_1     | 2023/11/30 18:14:32 [crit] 1256#0: *2369 [proxy-wasm]["main" #1] /%!(EXTRA T=GET, T=HTTP/1.1), client: 172.18.0.1, server: kong, request: "GET / HTTP/1.1", host: "localhost:8000"
httpbin_1  | time="2023-11-30T18:14:33.5486" status=200 method="GET" uri="/" size_bytes=11133 duration_ms=0.31 user_agent="HTTPie/2.6.0" client_ip=172.18.0.1
kong_1     | 2023/11/30 18:14:33 [error] 1256#0: *2369 [proxy-wasm]["main" #1] filter 1/1 cannot pause in "body_filter" phase ("on_response_body" step) while sending to client, client: 172.18.0.1, server: kong, request: "GET / HTTP/1.1", upstream: "http://172.18.0.2:8080/", host: "localhost:8000"
kong_1     | 2023/11/30 18:14:33 [warn] 1256#0: *2369 [proxy-wasm]["main" #1] filter 1/1 failed resuming "on_response_body" step in "body_filter" phase (not yieldable) while sending to client, client: 172.18.0.1, server: kong, request: "GET / HTTP/1.1", upstream: "http://172.18.0.2:8080/", host: "localhost:8000"
kong_1     | 172.18.0.1 - - [30/Nov/2023:18:14:33 +0000] "GET / HTTP/1.1" 200 11153 "-" "HTTPie/2.6.0"
kong_1     | 2023/11/30 18:14:35 [error] 1269#0: *310 [lua] worker.lua:266: communicate(): event worker failed: failed to receive the header bytes: closed, context: ngx.timer
kong_1     | 2023/11/30 18:14:35 [error] 1259#0: *1332 [lua] worker.lua:266: communicate(): event worker failed: failed to receive the header bytes: closed, context: ngx.timer
kong_1     | 2023/11/30 18:14:35 [error] 1268#0: *756 [lua] worker.lua:266: communicate(): event worker failed: failed to receive the header bytes: closed, context: ngx.timer
kong_1     | 2023/11/30 18:14:35 [error] 1270#0: *1482 [lua] worker.lua:266: communicate(): event worker failed: failed to receive the header bytes: closed, context: ngx.timer
kong_1     | 2023/11/30 18:14:35 [error] 1266#0: *2223 [lua] worker.lua:266: communicate(): event worker failed: failed to receive the header bytes: closed, context: ngx.timer
kong_1     | 2023/11/30 18:14:35 [error] 1262#0: *1791 [lua] worker.lua:266: communicate(): event worker failed: failed to receive the header bytes: closed, context: ngx.timer
kong_1     | 2023/11/30 18:14:35 [error] 1267#0: *456 [lua] worker.lua:266: communicate(): event worker failed: failed to receive the header bytes: closed, context: ngx.timer
kong_1     | 2023/11/30 18:14:35 [error] 1258#0: *748 [lua] worker.lua:266: communicate(): event worker failed: failed to receive the header bytes: closed, context: ngx.timer
kong_1     | 2023/11/30 18:14:35 [error] 1261#0: *1770 [lua] worker.lua:266: communicate(): event worker failed: failed to receive the header bytes: closed, context: ngx.timer
kong_1     | 2023/11/30 18:14:35 [error] 1271#0: *758 [lua] worker.lua:266: communicate(): event worker failed: failed to receive the header bytes: closed, context: ngx.timer
kong_1     | 2023/11/30 18:14:35 [error] 1264#0: *1478 [lua] worker.lua:266: communicate(): event worker failed: failed to receive the header bytes: closed, context: ngx.timer
kong_1     | 2023/11/30 18:14:35 [error] 1257#0: *752 [lua] worker.lua:266: communicate(): event worker failed: failed to receive the header bytes: closed, context: ngx.timer
kong_1     | 2023/11/30 18:14:35 [error] 1260#0: *460 [lua] worker.lua:266: communicate(): event worker failed: failed to receive the header bytes: closed, context: ngx.timer
kong_1     | 2023/11/30 18:14:35 [error] 1263#0: *164 [lua] worker.lua:266: communicate(): event worker failed: failed to receive the header bytes: closed, context: ngx.timer
kong_1     | 2023/11/30 18:14:35 [error] 1265#0: *18 [lua] worker.lua:266: communicate(): event worker failed: failed to receive the header bytes: closed, context: ngx.timer
kong_1     | 2023/11/30 18:14:35 [notice] 1#0: signal 17 (SIGCHLD) received from 1256
kong_1     | 2023/11/30 18:14:35 [alert] 1#0: worker process 1256 exited on signal 11 (core dumped)
kong_1     | 2023/11/30 18:14:35 [notice] 1#0: start worker process 2004
kong_1     | 2023/11/30 18:14:35 [notice] 1#0: signal 29 (SIGIO) received
kong_1     | 2023/11/30 18:14:35 [warn] 2004#0: found and cleared 1 stale readers from LMDB

Backtrace from within the container:

Program terminated with signal SIGSEGV, Segmentation fault.
#0  ngx_proxy_wasm_ctx_lookup (id=2, ictx=0x56255ad16db8)
    at /home/runner/.cache/bazel/_bazel_runner/5f5cb99aeb7e2b4db19484ea5022e949/execroot/kong/external/ngx_wasm_module/src/common/proxy_wasm/ngx_proxy_wasm.c:95
95	/home/runner/.cache/bazel/_bazel_runner/5f5cb99aeb7e2b4db19484ea5022e949/execroot/kong/external/ngx_wasm_module/src/common/proxy_wasm/ngx_proxy_wasm.c: No such file or directory.
(gdb) bt
#0  ngx_proxy_wasm_ctx_lookup (id=2, ictx=0x56255ad16db8)
    at /home/runner/.cache/bazel/_bazel_runner/5f5cb99aeb7e2b4db19484ea5022e949/execroot/kong/external/ngx_wasm_module/src/common/proxy_wasm/ngx_proxy_wasm.c:95
#1  ngx_proxy_wasm_on_start (ictx=ictx@entry=0x56255ad16db8, filter=filter@entry=0x56255ad16800, start=start@entry=0)
    at /home/runner/.cache/bazel/_bazel_runner/5f5cb99aeb7e2b4db19484ea5022e949/execroot/kong/external/ngx_wasm_module/src/common/proxy_wasm/ngx_proxy_wasm.c:1507
#2  0x00007f09b78d8356 in ngx_proxy_wasm_run_step (pwexec=pwexec@entry=0x56255ad13fe8, 
    ictx=ictx@entry=0x56255ad16db8, step=step@entry=NGX_PROXY_WASM_STEP_REQ_HEADERS)
    at /home/runner/.cache/bazel/_bazel_runner/5f5cb99aeb7e2b4db19484ea5022e949/execroot/kong/external/ngx_wasm_module/src/common/proxy_wasm/ngx_proxy_wasm.c:660
#3  0x00007f09b78d8a86 in ngx_proxy_wasm_resume (pwctx=0x56255b4b7500, phase=<optimized out>, 
    step=NGX_PROXY_WASM_STEP_REQ_HEADERS)
    at /home/runner/.cache/bazel/_bazel_runner/5f5cb99aeb7e2b4db19484ea5022e949/execroot/kong/external/ngx_wasm_module/src/common/proxy_wasm/ngx_proxy_wasm.c:785
#4  0x00007f09b78d035b in ngx_wasm_ops_resume (ctx=0x56255b1a9448, phaseidx=<optimized out>)
    at /home/runner/.cache/bazel/_bazel_runner/5f5cb99aeb7e2b4db19484ea5022e949/execroot/kong/external/ngx_wasm_module/src/wasm/ngx_wasm_ops.c:293
#5  0x00007f09b78e1d02 in ngx_http_wasm_ffi_start (r=0x56255b763020)
    at /home/runner/.cache/bazel/_bazel_runner/5f5cb99aeb7e2b4db19484ea5022e949/execroot/kong/external/ngx_wasm_module/src/common/lua/ngx_wasm_lua_ffi.c:164
#6  0x00007f09b92d1f89 in lj_vm_ffi_call () at buildvm_x86.dasc:2704
#7  0x00007f09b931fbe4 in lj_ccall_func (L=<optimized out>, cd=<optimized out>) at lj_ccall.c:1382
#8  0x00007f09b9337435 in lj_cf_ffi_meta___call (L=0x7f09a1b69290) at lib_ffi.c:230
#9  0x00007f09b92cfb3b in lj_BC_FUNCC () at buildvm_x86.dasc:859
#10 0x0000562558276146 in ngx_http_lua_run_thread (L=L@entry=0x7f09b7597380, r=r@entry=0x56255b763020, 
    ctx=ctx@entry=0x56255b763d78, nrets=<optimized out>, nrets@entry=0)
    at ../ngx_lua-0.10.25/src/ngx_http_lua_util.c:1184
#11 0x000056255827c2bb in ngx_http_lua_access_by_chunk (L=0x7f09b7597380, r=0x56255b763020)
    at ../ngx_lua-0.10.25/src/ngx_http_lua_accessby.c:337
#12 0x00005625581e08d1 in ngx_http_core_access_phase (r=0x56255b763020, ph=0x5625598bf678)
    at src/http/ngx_http_core_module.c:1110
#13 0x00005625581dbfcd in ngx_http_core_run_phases (r=0x56255b763020) at src/http/ngx_http_core_module.c:885
#14 0x00005625581e7917 in ngx_http_process_request_headers (rev=rev@entry=0x56255a8fb230)
    at src/http/ngx_http_request.c:1498
#15 0x00005625581e7d39 in ngx_http_process_request_line (rev=0x56255a8fb230) at src/http/ngx_http_request.c:1165
#16 0x00005625581cdf0f in ngx_epoll_process_events (cycle=<optimized out>, timer=<optimized out>, 
    flags=<optimized out>) at src/event/modules/ngx_epoll_module.c:901
#17 0x00005625581c3d0a in ngx_process_events_and_timers (cycle=cycle@entry=0x56255895bac0)
    at src/event/ngx_event.c:257
#18 0x00005625581cbcfe in ngx_worker_process_cycle (cycle=cycle@entry=0x56255895bac0, data=data@entry=0x0)
    at src/os/unix/ngx_process_cycle.c:793
#19 0x00005625581ca08f in ngx_spawn_process (cycle=cycle@entry=0x56255895bac0, 
    proc=proc@entry=0x5625581cbc80 <ngx_worker_process_cycle>, data=data@entry=0x0, 
    name=name@entry=0x5625582f7d5c "worker process", respawn=respawn@entry=-3) at src/os/unix/ngx_process.c:199
#20 0x00005625581cb390 in ngx_start_worker_processes (cycle=cycle@entry=0x56255895bac0, n=16, type=type@entry=-3)
    at src/os/unix/ngx_process_cycle.c:382
#21 0x00005625581cc754 in ngx_master_process_cycle (cycle=0x56255895bac0) at src/os/unix/ngx_process_cycle.c:135
#22 0x00005625581a15ee in main (argc=<optimized out>, argv=<optimized out>) at src/core/nginx.c:386

[arm64] Tests fail with `wasmtime` and `wasmer` runtimes on linux-arm64

make test fails on arm64 (Ubuntu 22.04 under UTM on an M1 macOS host), with different errors on the wasmtime and wasmer runtimes (at commit f355000).

wasmtime:

  • Build info
nginx version: nginx/1.21.6 (ngx_wasm_module [dev debug wasmtime])
built by gcc 11.2.0 (Ubuntu 11.2.0-19ubuntu1)
configure arguments: --build='ngx_wasm_module [dev debug wasmtime]' --builddir=/home/zhy/Code/ngx_wasm_module/work/buildroot --prefix=/home/zhy/Code/ngx_wasm_module/t/servroot --with-cc-opt='-O0 -ggdb3 -gdwarf' --with-ld-opt=' -Wl,-rpath,/home/zhy/Binary/wasmtime-v0.38.1-aarch64-linux-c-api/lib' --with-poll_module --with-debug --add-module=/home/zhy/Code/ngx_wasm_module --add-dynamic-module=/home/zhy/Code/ngx_wasm_module/work/downloads/echo-nginx-module --add-dynamic-module=/home/zhy/Code/ngx_wasm_module/work/downloads/headers-more-nginx-module
  • Test report
Test Summary Report
-------------------
t/02-http/001-wasm_call_directive.t                   (Wstat: 512 Tests: 72 Failed: 2)
  Failed tests:  69-70
  Non-zero exit status: 2
t/02-http/hfuncs/002-resp_set_status.t                (Wstat: 256 Tests: 6 Failed: 1)
  Failed test:  2
  Non-zero exit status: 1
t/02-http/hfuncs/003-say.t                            (Wstat: 256 Tests: 12 Failed: 1)
  Failed test:  2
  Non-zero exit status: 1
t/03-proxy_wasm/006-instance_lifecycle.t              (Wstat: 0 Tests: 34 Failed: 12)
  Failed tests:  1-2, 5-7, 10-11, 14-16, 33-34
  Parse errors: Bad plan.  You planned 32 tests but ran 34.
t/03-proxy_wasm/102-proxy_send_local_response.t       (Wstat: 0 Tests: 109 Failed: 12)
  Failed tests:  19-21, 25-26, 28, 37-38, 40-41, 106, 109
  Parse errors: Bad plan.  You planned 108 tests but ran 109.
t/03-proxy_wasm/111-proxy_set_http_request_header.t   (Wstat: 768 Tests: 75 Failed: 3)
  Failed tests:  26-28
  Non-zero exit status: 3
t/03-proxy_wasm/114-proxy_set_http_request_body.t     (Wstat: 768 Tests: 35 Failed: 3)
  Failed tests:  16-18
  Non-zero exit status: 3
t/03-proxy_wasm/115-proxy_get_http_response_body.t    (Wstat: 512 Tests: 30 Failed: 2)
  Failed tests:  11-12
  Non-zero exit status: 2
t/03-proxy_wasm/116-proxy_set_http_response_body.t    (Wstat: 1024 Tests: 49 Failed: 4)
  Failed tests:  15, 17-19
  Non-zero exit status: 4
t/03-proxy_wasm/130-proxy_dispatch_http_call.t        (Wstat: 7424 Tests: 120 Failed: 29)
  Failed tests:  3, 17-19, 21-22, 24-27, 67, 69-71, 73-75
                85-87, 89-91, 93-95, 117-119
  Non-zero exit status: 29
t/03-proxy_wasm/131-proxy_http_dispatch_timeouts.t    (Wstat: 256 Tests: 16 Failed: 1)
  Failed test:  15
  Non-zero exit status: 1
Files=36, Tests=1621, 925 wallclock secs ( 0.85 usr  0.34 sys +  9.49 cusr  6.40 csys = 17.08 CPU)
Result: FAIL

wasmer (with singlepass backend):

  • Build info
nginx version: nginx/1.21.6 (ngx_wasm_module [dev debug wasmer])
built by gcc 11.2.0 (Ubuntu 11.2.0-19ubuntu1)
configure arguments: --build='ngx_wasm_module [dev debug wasmer]' --builddir=/home/zhy/Code/ngx_wasm_module/work/buildroot --prefix=/home/zhy/Code/ngx_wasm_module/t/servroot --with-cc-opt='-O0 -ggdb3 -gdwarf' --with-ld-opt=' -Wl,-rpath,/home/zhy/Binary/wasmer/lib' --with-poll_module --with-debug --add-module=/home/zhy/Code/ngx_wasm_module --add-dynamic-module=/home/zhy/Code/ngx_wasm_module/work/downloads/echo-nginx-module --add-dynamic-module=/home/zhy/Code/ngx_wasm_module/work/downloads/headers-more-nginx-module
  • Test report
Test Summary Report
-------------------
t/03-proxy_wasm/130-proxy_dispatch_http_call.t        (Wstat: 1792 Tests: 120 Failed: 7)
  Failed tests:  17-19, 41-43, 99
  Non-zero exit status: 7
t/03-proxy_wasm/131-proxy_http_dispatch_timeouts.t    (Wstat: 0 Tests: 17 Failed: 5)
  Failed tests:  9-12, 17
  Parse errors: Bad plan.  You planned 16 tests but ran 17.
Files=36, Tests=1619, 387 wallclock secs ( 1.12 usr  0.27 sys + 10.41 cusr  7.27 csys = 19.07 CPU)
Result: FAIL

wasmer (with cranelift backend):

  • Test report
Test Summary Report
-------------------
t/03-proxy_wasm/006-instance_lifecycle.t              (Wstat: 512 Tests: 32 Failed: 2)
  Failed tests:  10, 18
  Non-zero exit status: 2
t/03-proxy_wasm/102-proxy_send_local_response.t       (Wstat: 256 Tests: 108 Failed: 1)
  Failed test:  46
  Non-zero exit status: 1
t/03-proxy_wasm/131-proxy_http_dispatch_timeouts.t    (Wstat: 0 Tests: 17 Failed: 5)
  Failed tests:  1-4, 17
  Parse errors: Bad plan.  You planned 16 tests but ran 17.
t/03-proxy_wasm/130-proxy_dispatch_http_call.t        (Wstat: 1792 Tests: 120 Failed: 7)
  Failed tests:  33-35, 105-107, 119
  Non-zero exit status: 7
Files=36, Tests=1619, 149 wallclock secs ( 0.26 usr  0.11 sys +  6.66 cusr  3.19 csys = 10.22 CPU)
Result: FAIL

Deprecation warnings in GitHub Actions

Lately a couple of warnings are appearing in our GitHub Actions annotations:

Node.js 12 actions are deprecated. For more information see: https://github.blog/changelog/2022-09-22-github-actions-all-actions-will-begin-running-on-node16-instead-of-node12/. Please update the following actions to use Node.js 16: actions/checkout, actions/cache, actions/cache, actions/checkout

The save-state/set-output command[s] [are] deprecated and will be disabled soon. Please upgrade to using Environment Files. For more information see: https://github.blog/changelog/2022-10-11-github-actions-deprecating-save-state-and-set-output-commands/

This issue is opened as a reminder to address those before they are fully EOL'd and bite us. Help welcome!
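For the second warning, the migration amounts to writing key=value pairs to the file GitHub exposes as $GITHUB_OUTPUT instead of printing workflow commands. A quick sketch, runnable outside of Actions by pointing $GITHUB_OUTPUT at any file:

```shell
# Deprecated form, emitted as a workflow command on stdout:
#   echo "::set-output name=version::1.2.3"

# Replacement: append to the Environment File exposed as $GITHUB_OUTPUT.
# Outside of a runner, point it at a temp file to try it out:
GITHUB_OUTPUT="$(mktemp)"
echo "version=1.2.3" >> "$GITHUB_OUTPUT"
cat "$GITHUB_OUTPUT"    # version=1.2.3
```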

Go

Add support for Go
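For context, a minimal filter with the tetratelabs proxy-wasm-go-sdk looks roughly like the following (a sketch under the assumption that this SDK is the target; names follow that SDK's interfaces):

```go
// Minimal proxy-wasm filter sketch using the proxy-wasm-go-sdk.
// Compiled to Wasm with TinyGo, it would be loaded with a `module`
// directive like the Rust filters.
package main

import (
	"github.com/tetratelabs/proxy-wasm-go-sdk/proxywasm"
	"github.com/tetratelabs/proxy-wasm-go-sdk/proxywasm/types"
)

func main() {
	proxywasm.SetVMContext(&vmContext{})
}

type vmContext struct{ types.DefaultVMContext }

func (*vmContext) NewPluginContext(uint32) types.PluginContext {
	return &pluginContext{}
}

type pluginContext struct{ types.DefaultPluginContext }

func (*pluginContext) NewHttpContext(uint32) types.HttpContext {
	return &httpContext{}
}

type httpContext struct{ types.DefaultHttpContext }

// OnHttpRequestHeaders runs on each request's headers; here it only logs.
func (*httpContext) OnHttpRequestHeaders(int, bool) types.Action {
	proxywasm.LogInfo("hello from a Go filter")
	return types.ActionContinue
}
```

The SDK's documented build line is something like `tinygo build -o filter.wasm -scheduler=none -target=wasi .`; the resulting filter.wasm would then be declared in the `wasm{}` block.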

Fix V8 builds in ArchLinux build image

Can be reproduced locally with ./util/release.sh --bin-all archv8 (temporarily rename or remove all release Dockerfiles but Dockerfile.amd64.archlinux to build only an ArchLinux release).

Example in a recent release job:
https://github.com/Kong/ngx_wasm_module/actions/runs/5173873700/jobs/9320127759

Opened an issue with V8, but it drew no interest: https://bugs.chromium.org/p/v8/issues/detail?id=14031. I wonder if some of our gn options might be the cause. Yet on my local Manjaro (an ArchLinux-based distro), I encounter no problems building V8.


coraza-proxy-wasm support

I was trying to see if https://github.com/corazawaf/coraza-proxy-wasm would work with this module.

A Wasm module can be downloaded from: https://github.com/corazawaf/coraza-proxy-wasm/releases/download/0.4.0/coraza-proxy-wasm-0.4.0.zip

error.log file:

2023/11/30 12:10:01 [error] 63932#0: *1 [wasm] error while executing at wasm backtrace:
    0: 0x1db40d - <unknown>!(*github.com/corazawaf/coraza-proxy-wasm/wasmplugin.wafMetrics).incrementCounter
    1: 0x1d98ea - <unknown>!proxy_on_request_headers
note: using the `WASMTIME_BACKTRACE_DETAILS=1` environment variable may show more debugging information

Caused by:
    host trap (function not yet implemented): proxy_define_metric <module: "coraza", vm: "main", runtime: "wasmtime">

Creating this issue to track completeness and compatibility with a more complicated proxy-wasm filter such as coraza-proxy-wasm.

nginx config:

daemon off;
worker_processes  auto;
master_process    off;

events {
    worker_connections  2048;
}

wasm {
    module coraza coraza-proxy-wasm.wasm;
}

http {
    server {
        listen 9000;

        location / {
            proxy_wasm coraza;
            proxy_pass http://127.0.0.1:8000/;
        }
    }

    server {
        listen 8000;
        location / {
            return 200 "Hello, World!";
        }
    }
}

sending a request to nginx:

curl -v localhost:9000
*   Trying 127.0.0.1:9000...
* Connected to localhost (127.0.0.1) port 9000 (#0)
> GET / HTTP/1.1
> Host: localhost:9000
> User-Agent: curl/8.1.2
> Accept: */*
> 
< HTTP/1.1 500 Internal Server Error
< Content-Type: text/html
< Content-Length: 177
< Connection: close
< Server: nginx/1.25.3
< Date: Thu, 30 Nov 2023 20:10:01 GMT
< 
<html>
<head><title>500 Internal Server Error</title></head>
<body>
<center><h1>500 Internal Server Error</h1></center>
<hr><center>nginx/1.25.3</center>
</body>
</html>
* Closing connection 0

V8 trap handler support in ARM

I got the following message when running WasmX backed by V8 on linux-arm64:
nginx: [error] [wasm] failed to enable v8 trap handler

We call v8::V8::EnableWebAssemblyTrapHandler from the V8 API to enable trap handler support, which in turn calls EnableTrapHandler.

EnableTrapHandler will fail (return false) either if V8 wasn't built with trap handler support (which seems to be defined here), or if it's called too late, e.g. if objects have already been created without trap handler support.

Given that we don't observe nginx: [error] [wasm] failed to enable v8 trap handler in the logs when running on x86_64, this issue seems to be explained by some misconfiguration during the build.
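For reference, the embedder API involved is essentially a one-liner (a sketch against the V8 C++ API, requiring the V8 headers; it must run before any Wasm objects are created):

```cpp
// Sketch: enabling V8's Wasm trap handler from an embedder.
// EnableWebAssemblyTrapHandler() returns false if V8 was built without
// trap handler support for the target, or if it is called too late.
#include <v8.h>

bool enable_trap_handler() {
    // true: let V8 install its own signal handler for Wasm traps
    return v8::V8::EnableWebAssemblyTrapHandler(/* use_v8_signal_handler */ true);
}
```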

[V8] release.yml is still failing

Opening an issue to track the problems we're still having with the release workflow:

  • Ubuntu 18.04 is failing because the V8-shipped pkgconfig can't find glib-2.0; build succeeded by fixing the linking order
  • Ubuntu 20.04 is failing because the V8-shipped pkgconfig can't find glib-2.0; build succeeded by fixing the linking order
  • CentOS 7 is failing because the V8-shipped pkgconfig can't find glib-2.0
  • CentOS 8 is failing because the V8-shipped pkgconfig can't find glib-2.0; build succeeded by fixing the linking order
  • ArchLinux is failing when linking v8bridge; build succeeded by fixing the linking order
  • macOS is failing when building v8bridge ("<stdlib.h> not found"??); build succeeded by using the host compiler

Error logs

Snippets of the relevant messages from the error logs:

Ubuntu 18.04 (glib-2.0 pkg-config error)
2022-08-15T06:27:29.8201474Z generating V8 build files...
2022-08-15T06:27:30.2928533Z ################################################################################
2022-08-15T06:27:30.2929475Z /usr/bin/python3 -u tools/mb/mb.py gen -f infra/mb/mb_config.pyl -m developer_default -b x64.release.sample out.gn/x64.release.sample
2022-08-15T06:27:30.2929881Z   
2022-08-15T06:27:30.2931852Z   Writing """\
2022-08-15T06:27:30.2932539Z   dcheck_always_on = false
2022-08-15T06:27:30.2933015Z   is_component_build = false
2022-08-15T06:27:30.2933280Z   is_debug = false
2022-08-15T06:27:30.2933517Z   target_cpu = "x64"
2022-08-15T06:27:30.2933768Z   use_custom_libcxx = false
2022-08-15T06:27:30.2934006Z   v8_monolithic = true
2022-08-15T06:27:30.2934275Z   v8_use_external_startup_data = false
2022-08-15T06:27:30.2934668Z   """ to /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/out.gn/x64.release.sample/args.gn.
2022-08-15T06:27:30.2934997Z   
2022-08-15T06:27:30.2935512Z   /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/buildtools/linux64/gn gen out.gn/x64.release.sample --check
2022-08-15T06:27:30.2935918Z     -> returned 1
2022-08-15T06:27:30.2936388Z   ERROR at //build/config/linux/pkg_config.gni:104:17: Script returned non-zero exit code.
2022-08-15T06:27:30.2937023Z       pkgresult = exec_script(pkg_config_script, args, "value")
2022-08-15T06:27:30.2937392Z                   ^----------
2022-08-15T06:27:30.2937784Z   Current dir: /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/out.gn/x64.release.sample/
2022-08-15T06:27:30.2938845Z   Command: python3 /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/config/linux/pkg-config.py -s /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/linux/debian_bullseye_amd64-sysroot -a x64 glib-2.0 gmodule-2.0 gobject-2.0 gthread-2.0
2022-08-15T06:27:30.2939440Z   Returned 1.
2022-08-15T06:27:30.2939674Z   stderr:
2022-08-15T06:27:30.2939873Z   
2022-08-15T06:27:30.2940279Z   Package glib-2.0 was not found in the pkg-config search path.
2022-08-15T06:27:30.2940886Z   Perhaps you should add the directory containing `glib-2.0.pc'
2022-08-15T06:27:30.2941242Z   to the PKG_CONFIG_PATH environment variable
2022-08-15T06:27:30.2941575Z   No package 'glib-2.0' found
2022-08-15T06:27:30.2942003Z   Package gmodule-2.0 was not found in the pkg-config search path.
2022-08-15T06:27:30.2942499Z   Perhaps you should add the directory containing `gmodule-2.0.pc'
2022-08-15T06:27:30.2942859Z   to the PKG_CONFIG_PATH environment variable
2022-08-15T06:27:30.2943202Z   No package 'gmodule-2.0' found
2022-08-15T06:27:30.2943629Z   Package gobject-2.0 was not found in the pkg-config search path.
2022-08-15T06:27:30.2944106Z   Perhaps you should add the directory containing `gobject-2.0.pc'
2022-08-15T06:27:30.2944459Z   to the PKG_CONFIG_PATH environment variable
2022-08-15T06:27:30.2944798Z   No package 'gobject-2.0' found
2022-08-15T06:27:30.2945219Z   Package gthread-2.0 was not found in the pkg-config search path.
2022-08-15T06:27:30.2945697Z   Perhaps you should add the directory containing `gthread-2.0.pc'
2022-08-15T06:27:30.2946047Z   to the PKG_CONFIG_PATH environment variable
2022-08-15T06:27:30.2946380Z   No package 'gthread-2.0' found
2022-08-15T06:27:30.2946657Z   Traceback (most recent call last):
2022-08-15T06:27:30.2947615Z     File "/__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/config/linux/pkg-config.py", line 248, in <module>
2022-08-15T06:27:30.2947944Z       sys.exit(main())
2022-08-15T06:27:30.2948408Z     File "/__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/config/linux/pkg-config.py", line 143, in main
2022-08-15T06:27:30.2948783Z       prefix = GetPkgConfigPrefixToStrip(options, args)
2022-08-15T06:27:30.2949364Z     File "/__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/config/linux/pkg-config.py", line 82, in GetPkgConfigPrefixToStrip
2022-08-15T06:27:30.2949860Z       "--variable=prefix"] + args, env=os.environ).decode('utf-8')
2022-08-15T06:27:30.2950195Z     File "/usr/lib/python3.6/subprocess.py", line 356, in check_output
2022-08-15T06:27:30.2950446Z       **kwargs).stdout
2022-08-15T06:27:30.2950709Z     File "/usr/lib/python3.6/subprocess.py", line 438, in run
2022-08-15T06:27:30.2950982Z       output=stdout, stderr=stderr)
2022-08-15T06:27:30.2951669Z   subprocess.CalledProcessError: Command '['pkg-config', '--variable=prefix', 'glib-2.0', 'gmodule-2.0', 'gobject-2.0', 'gthread-2.0']' returned non-zero exit status 1.
2022-08-15T06:27:30.2952047Z   
2022-08-15T06:27:30.2952292Z   See //build/config/linux/BUILD.gn:58:3: whence it was called.
2022-08-15T06:27:30.2952561Z     pkg_config("glib") {
2022-08-15T06:27:30.2952831Z     ^-------------------
2022-08-15T06:27:30.2953118Z   See //build/config/compiler/BUILD.gn:269:18: which caused the file to be included.
2022-08-15T06:27:30.2953447Z       configs += [ "//build/config/linux:compiler" ]
2022-08-15T06:27:30.2953784Z                    ^------------------------------
2022-08-15T06:27:30.2954016Z   GN gen failed: 1
2022-08-15T06:27:30.2975089Z Traceback (most recent call last):
2022-08-15T06:27:30.2975445Z   File "tools/dev/v8gen.py", line 309, in <module>
2022-08-15T06:27:30.2975756Z     sys.exit(gen.main())
2022-08-15T06:27:30.2976057Z   File "tools/dev/v8gen.py", line 303, in main
2022-08-15T06:27:30.2976383Z     return self._options.func()
2022-08-15T06:27:30.2976695Z   File "tools/dev/v8gen.py", line 169, in cmd_gen
2022-08-15T06:27:30.2976978Z     gn_outdir,
2022-08-15T06:27:30.2977266Z   File "tools/dev/v8gen.py", line 213, in _call_cmd
2022-08-15T06:27:30.2977568Z     stderr=subprocess.STDOUT,
2022-08-15T06:27:30.2977929Z   File "/usr/lib/python3.6/subprocess.py", line 356, in check_output
2022-08-15T06:27:30.2978254Z     **kwargs).stdout
2022-08-15T06:27:30.2978571Z   File "/usr/lib/python3.6/subprocess.py", line 438, in run
2022-08-15T06:27:30.2978887Z     output=stdout, stderr=stderr)
2022-08-15T06:27:30.2979957Z subprocess.CalledProcessError: Command '['/usr/bin/python3', '-u', 'tools/mb/mb.py', 'gen', '-f', 'infra/mb/mb_config.pyl', '-m', 'developer_default', '-b', 'x64.release.sample', 'out.gn/x64.release.sample']' returned non-zero exit status 1.
2022-08-15T06:27:30.4208666Z ##[error]Process completed with exit code 1.

**Ubuntu 20.04 (glib-2.0 pkg-config error)**

2022-08-15T06:26:13.4627940Z generating V8 build files...
2022-08-15T06:26:13.8476986Z ################################################################################
2022-08-15T06:26:13.8477903Z /usr/bin/python3 -u tools/mb/mb.py gen -f infra/mb/mb_config.pyl -m developer_default -b x64.release.sample out.gn/x64.release.sample
2022-08-15T06:26:13.8478268Z   
2022-08-15T06:26:13.8478457Z   Writing """\
2022-08-15T06:26:13.8478664Z   dcheck_always_on = false
2022-08-15T06:26:13.8478906Z   is_component_build = false
2022-08-15T06:26:13.8479133Z   is_debug = false
2022-08-15T06:26:13.8479365Z   target_cpu = "x64"
2022-08-15T06:26:13.8479576Z   use_custom_libcxx = false
2022-08-15T06:26:13.8479801Z   v8_monolithic = true
2022-08-15T06:26:13.8480247Z   v8_use_external_startup_data = false
2022-08-15T06:26:13.8480617Z   """ to /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/out.gn/x64.release.sample/args.gn.
2022-08-15T06:26:13.8480902Z   
2022-08-15T06:26:13.8481391Z   /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/buildtools/linux64/gn gen out.gn/x64.release.sample --check
2022-08-15T06:26:13.8481767Z     -> returned 1
2022-08-15T06:26:13.8482185Z   ERROR at //build/config/linux/pkg_config.gni:104:17: Script returned non-zero exit code.
2022-08-15T06:26:13.8482537Z       pkgresult = exec_script(pkg_config_script, args, "value")
2022-08-15T06:26:13.8482857Z                   ^----------
2022-08-15T06:26:13.8483198Z   Current dir: /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/out.gn/x64.release.sample/
2022-08-15T06:26:13.8487909Z   Command: python3 /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/config/linux/pkg-config.py -s /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/linux/debian_bullseye_amd64-sysroot -a x64 glib-2.0 gmodule-2.0 gobject-2.0 gthread-2.0
2022-08-15T06:26:13.8488441Z   Returned 1.
2022-08-15T06:26:13.8488629Z   stderr:
2022-08-15T06:26:13.8488813Z   
2022-08-15T06:26:13.8489168Z   Package glib-2.0 was not found in the pkg-config search path.
2022-08-15T06:26:13.8489587Z   Perhaps you should add the directory containing `glib-2.0.pc'
2022-08-15T06:26:13.8489911Z   to the PKG_CONFIG_PATH environment variable
2022-08-15T06:26:13.8490230Z   No package 'glib-2.0' found
2022-08-15T06:26:13.8490615Z   Package gmodule-2.0 was not found in the pkg-config search path.
2022-08-15T06:26:13.8491045Z   Perhaps you should add the directory containing `gmodule-2.0.pc'
2022-08-15T06:26:13.8491372Z   to the PKG_CONFIG_PATH environment variable
2022-08-15T06:26:13.8491687Z Traceback (most recent call last):
2022-08-15T06:26:13.8491968Z   File "tools/dev/v8gen.py", line 309, in <module>
2022-08-15T06:26:13.8492226Z     sys.exit(gen.main())
2022-08-15T06:26:13.8492465Z   File "tools/dev/v8gen.py", line 303, in main
2022-08-15T06:26:13.8493004Z     return self._options.func()
2022-08-15T06:26:13.8493282Z   File "tools/dev/v8gen.py", line 162, in cmd_gen
2022-08-15T06:26:13.8493530Z     self._call_cmd([
2022-08-15T06:26:13.8493764Z   File "tools/dev/v8gen.py", line 211, in _call_cmd
2022-08-15T06:26:13.8494042Z     output = subprocess.check_output(
2022-08-15T06:26:13.8494356Z   File "/usr/lib/python3.8/subprocess.py", line 415, in check_output
2022-08-15T06:26:13.8494715Z     return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
2022-08-15T06:26:13.8495040Z   File "/usr/lib/python3.8/subprocess.py", line 516, in run
2022-08-15T06:26:13.8495362Z     raise CalledProcessError(retcode, process.args,
2022-08-15T06:26:13.8496169Z subprocess.CalledProcessError: Command '['/usr/bin/python3', '-u', 'tools/mb/mb.py', 'gen', '-f', 'infra/mb/mb_config.pyl', '-m', 'developer_default', '-b', 'x64.release.sample', 'out.gn/x64.release.sample']' returned non-zero exit status 1.
2022-08-15T06:26:13.8496857Z   No package 'gmodule-2.0' found
2022-08-15T06:26:13.8497252Z   Package gobject-2.0 was not found in the pkg-config search path.
2022-08-15T06:26:13.8497686Z   Perhaps you should add the directory containing `gobject-2.0.pc'
2022-08-15T06:26:13.8498017Z   to the PKG_CONFIG_PATH environment variable
2022-08-15T06:26:13.8498342Z   No package 'gobject-2.0' found
2022-08-15T06:26:13.8498733Z   Package gthread-2.0 was not found in the pkg-config search path.
2022-08-15T06:26:13.8499161Z   Perhaps you should add the directory containing `gthread-2.0.pc'
2022-08-15T06:26:13.8499488Z   to the PKG_CONFIG_PATH environment variable
2022-08-15T06:26:13.8499815Z   No package 'gthread-2.0' found
2022-08-15T06:26:13.8500068Z   Traceback (most recent call last):
2022-08-15T06:26:13.8500567Z     File "/__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/config/linux/pkg-config.py", line 248, in <module>
2022-08-15T06:26:13.8500905Z       sys.exit(main())
2022-08-15T06:26:13.8501385Z     File "/__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/config/linux/pkg-config.py", line 143, in main
2022-08-15T06:26:13.8501789Z       prefix = GetPkgConfigPrefixToStrip(options, args)
2022-08-15T06:26:13.8502384Z     File "/__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/config/linux/pkg-config.py", line 81, in GetPkgConfigPrefixToStrip
2022-08-15T06:26:13.8502819Z       prefix = subprocess.check_output([options.pkg_config,
2022-08-15T06:26:13.8503161Z     File "/usr/lib/python3.8/subprocess.py", line 415, in check_output
2022-08-15T06:26:13.8503511Z       return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
2022-08-15T06:26:13.8503850Z     File "/usr/lib/python3.8/subprocess.py", line 516, in run
2022-08-15T06:26:13.8504162Z       raise CalledProcessError(retcode, process.args,
2022-08-15T06:26:13.8504828Z   subprocess.CalledProcessError: Command '['pkg-config', '--variable=prefix', 'glib-2.0', 'gmodule-2.0', 'gobject-2.0', 'gthread-2.0']' returned non-zero exit status 1.
2022-08-15T06:26:13.8505233Z   
2022-08-15T06:26:13.8505490Z   See //build/config/linux/BUILD.gn:58:3: whence it was called.
2022-08-15T06:26:13.8505748Z     pkg_config("glib") {
2022-08-15T06:26:13.8506018Z     ^-------------------
2022-08-15T06:26:13.8506327Z   See //build/config/compiler/BUILD.gn:269:18: which caused the file to be included.
2022-08-15T06:26:13.8506663Z       configs += [ "//build/config/linux:compiler" ]
2022-08-15T06:26:13.8506994Z                    ^------------------------------
2022-08-15T06:26:13.8507229Z   GN gen failed: 1
2022-08-15T06:26:13.8788158Z ##[error]Process completed with exit code 1.
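The traceback above bottoms out in `subprocess.check_output`, which raises `CalledProcessError` whenever the child process exits non-zero — this is why the missing glib packages surface as a Python exception rather than a plain pkg-config message. A minimal, self-contained reproduction of that failure mode (using a deliberately failing child instead of `pkg-config`):

```python
import subprocess
import sys

# Mimic what build/config/linux/pkg-config.py hits: check_output() raises
# CalledProcessError as soon as the probed command exits non-zero.
try:
    subprocess.check_output([sys.executable, "-c", "raise SystemExit(1)"])
except subprocess.CalledProcessError as e:
    print(e.returncode)  # prints 1
```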

**CentOS 7 (glib-2.0 pkg-config error)**

2022-08-15T06:27:38.8093028Z generating V8 build files...
2022-08-15T06:27:39.3008376Z ################################################################################
2022-08-15T06:27:39.3009277Z /usr/bin/python3 -u tools/mb/mb.py gen -f infra/mb/mb_config.pyl -m developer_default -b x64.release.sample out.gn/x64.release.sample
2022-08-15T06:27:39.3009658Z   
2022-08-15T06:27:39.3009833Z   Writing """\
2022-08-15T06:27:39.3010052Z   dcheck_always_on = false
2022-08-15T06:27:39.3010291Z   is_component_build = false
2022-08-15T06:27:39.3010515Z   is_debug = false
2022-08-15T06:27:39.3010711Z   target_cpu = "x64"
2022-08-15T06:27:39.3010939Z   use_custom_libcxx = false
2022-08-15T06:27:39.3011477Z   v8_monolithic = true
2022-08-15T06:27:39.3011727Z   v8_use_external_startup_data = false
2022-08-15T06:27:39.3012069Z   """ to /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/out.gn/x64.release.sample/args.gn.
2022-08-15T06:27:39.3012364Z   
2022-08-15T06:27:39.3012843Z   /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/buildtools/linux64/gn gen out.gn/x64.release.sample --check
2022-08-15T06:27:39.3013216Z     -> returned 1
2022-08-15T06:27:39.3013628Z   ERROR at //build/config/linux/pkg_config.gni:104:17: Script returned non-zero exit code.
2022-08-15T06:27:39.3013996Z       pkgresult = exec_script(pkg_config_script, args, "value")
2022-08-15T06:27:39.3014323Z                   ^----------
2022-08-15T06:27:39.3014659Z   Current dir: /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/out.gn/x64.release.sample/
2022-08-15T06:27:39.3015532Z   Command: python3 /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/config/linux/pkg-config.py -s /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/linux/debian_bullseye_amd64-sysroot -a x64 glib-2.0 gmodule-2.0 gobject-2.0 gthread-2.0
2022-08-15T06:27:39.3019027Z   Returned 1.
2022-08-15T06:27:39.3019223Z   stderr:
2022-08-15T06:27:39.3019405Z   
2022-08-15T06:27:39.3019619Z   Traceback (most recent call last):
2022-08-15T06:27:39.3020139Z     File "/__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/config/linux/pkg-config.py", line 248, in <module>
2022-08-15T06:27:39.3020477Z       sys.exit(main())
2022-08-15T06:27:39.3020951Z     File "/__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/config/linux/pkg-config.py", line 143, in main
2022-08-15T06:27:39.3021361Z       prefix = GetPkgConfigPrefixToStrip(options, args)
2022-08-15T06:27:39.3021961Z     File "/__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/config/linux/pkg-config.py", line 82, in GetPkgConfigPrefixToStrip
2022-08-15T06:27:39.3022473Z       "--variable=prefix"] + args, env=os.environ).decode('utf-8')
2022-08-15T06:27:39.3022824Z     File "/usr/lib64/python3.6/subprocess.py", line 356, in check_output
2022-08-15T06:27:39.3023102Z       **kwargs).stdout
2022-08-15T06:27:39.3023375Z     File "/usr/lib64/python3.6/subprocess.py", line 438, in run
2022-08-15T06:27:39.3023646Z       output=stdout, stderr=stderr)
2022-08-15T06:27:39.3024282Z   subprocess.CalledProcessError: Command '['pkg-config', '--variable=prefix', 'glib-2.0', 'gmodule-2.0', 'gobject-2.0', 'gthread-2.0']' returned non-zero exit status 1.
2022-08-15T06:27:39.3024685Z   
2022-08-15T06:27:39.3024939Z   See //build/config/linux/BUILD.gn:58:3: whence it was called.
2022-08-15T06:27:39.3025341Z     pkg_config("glib") {
2022-08-15T06:27:39.3025615Z     ^-------------------
2022-08-15T06:27:39.3025964Z   See //build/config/compiler/BUILD.gn:269:18: which caused the file to be included.
2022-08-15T06:27:39.3026302Z       configs += [ "//build/config/linux:compiler" ]
2022-08-15T06:27:39.3026635Z                    ^------------------------------
2022-08-15T06:27:39.3026875Z   GN gen failed: 1
2022-08-15T06:27:39.3027145Z Traceback (most recent call last):
2022-08-15T06:27:39.3027419Z   File "tools/dev/v8gen.py", line 309, in <module>
2022-08-15T06:27:39.3027670Z     sys.exit(gen.main())
2022-08-15T06:27:39.3027905Z   File "tools/dev/v8gen.py", line 303, in main
2022-08-15T06:27:39.3028167Z     return self._options.func()
2022-08-15T06:27:39.3028436Z   File "tools/dev/v8gen.py", line 169, in cmd_gen
2022-08-15T06:27:39.3028675Z     gn_outdir,
2022-08-15T06:27:39.3028904Z   File "tools/dev/v8gen.py", line 213, in _call_cmd
2022-08-15T06:27:39.3029169Z     stderr=subprocess.STDOUT,
2022-08-15T06:27:39.3029478Z   File "/usr/lib64/python3.6/subprocess.py", line 356, in check_output
2022-08-15T06:27:39.3029740Z     **kwargs).stdout
2022-08-15T06:27:39.3030009Z   File "/usr/lib64/python3.6/subprocess.py", line 438, in run
2022-08-15T06:27:39.3030291Z     output=stdout, stderr=stderr)
2022-08-15T06:27:39.3031138Z subprocess.CalledProcessError: Command '['/usr/bin/python3', '-u', 'tools/mb/mb.py', 'gen', '-f', 'infra/mb/mb_config.pyl', '-m', 'developer_default', '-b', 'x64.release.sample', 'out.gn/x64.release.sample']' returned non-zero exit status 1.
2022-08-15T06:27:39.3244283Z ##[error]Process completed with exit code 1.

**CentOS 8 (glib-2.0 pkg-config error)**

2022-08-15T06:32:30.9009216Z generating V8 build files...
2022-08-15T06:32:31.5054656Z ################################################################################
2022-08-15T06:32:31.5055664Z /usr/bin/python3 -u tools/mb/mb.py gen -f infra/mb/mb_config.pyl -m developer_default -b x64.release.sample out.gn/x64.release.sample
2022-08-15T06:32:31.5071137Z Traceback (most recent call last):
2022-08-15T06:32:31.5074151Z   
2022-08-15T06:32:31.5074380Z   Writing """\
2022-08-15T06:32:31.5074645Z   dcheck_always_on = false
2022-08-15T06:32:31.5074932Z   is_component_build = false
2022-08-15T06:32:31.5075199Z   is_debug = false
2022-08-15T06:32:31.5075462Z   target_cpu = "x64"
2022-08-15T06:32:31.5075714Z   use_custom_libcxx = false
2022-08-15T06:32:31.5075982Z   v8_monolithic = true
2022-08-15T06:32:31.5076263Z   v8_use_external_startup_data = false
2022-08-15T06:32:31.5076655Z   """ to /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/out.gn/x64.release.sample/args.gn.
2022-08-15T06:32:31.5076999Z   
2022-08-15T06:32:31.5078605Z   /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/buildtools/linux64/gn gen out.gn/x64.release.sample --check
2022-08-15T06:32:31.5079103Z     -> returned 1
2022-08-15T06:32:31.5079602Z   ERROR at //build/config/linux/pkg_config.gni:104:17: Script returned non-zero exit code.
2022-08-15T06:32:31.5080011Z       pkgresult = exec_script(pkg_config_script, args, "value")
2022-08-15T06:32:31.5080381Z                   ^----------
2022-08-15T06:32:31.5080775Z   Current dir: /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/out.gn/x64.release.sample/
2022-08-15T06:32:31.5082398Z   Command: python3 /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/config/linux/pkg-config.py -s /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/linux/debian_bullseye_amd64-sysroot -a x64 glib-2.0 gmodule-2.0 gobject-2.0 gthread-2.0
2022-08-15T06:32:31.5083018Z   Returned 1.
2022-08-15T06:32:31.5083231Z   stderr:
2022-08-15T06:32:31.5083442Z   
2022-08-15T06:32:31.5083687Z   Traceback (most recent call last):
2022-08-15T06:32:31.5084284Z     File "/__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/config/linux/pkg-config.py", line 248, in <module>
2022-08-15T06:32:31.5084674Z       sys.exit(main())
2022-08-15T06:32:31.5085216Z     File "/__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/config/linux/pkg-config.py", line 143, in main
2022-08-15T06:32:31.5085681Z       prefix = GetPkgConfigPrefixToStrip(options, args)
2022-08-15T06:32:31.5086560Z     File "/__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/build/config/linux/pkg-config.py", line 82, in GetPkgConfigPrefixToStrip
2022-08-15T06:32:31.5087143Z       "--variable=prefix"] + args, env=os.environ).decode('utf-8')
2022-08-15T06:32:31.5087546Z     File "/usr/lib64/python3.6/subprocess.py", line 356, in check_output
2022-08-15T06:32:31.5087868Z       **kwargs).stdout
2022-08-15T06:32:31.5088240Z     File "/usr/lib64/python3.6/subprocess.py", line 438, in run
2022-08-15T06:32:31.5088553Z       output=stdout, stderr=stderr)
2022-08-15T06:32:31.5089283Z   subprocess.CalledProcessError: Command '['pkg-config', '--variable=prefix', 'glib-2.0', 'gmodule-2.0', 'gobject-2.0', 'gthread-2.0']' returned non-zero exit status 1.
2022-08-15T06:32:31.5089741Z   
2022-08-15T06:32:31.5090036Z   See //build/config/linux/BUILD.gn:58:3: whence it was called.
2022-08-15T06:32:31.5090329Z     pkg_config("glib") {
2022-08-15T06:32:31.5090641Z     ^-------------------
2022-08-15T06:32:31.5091003Z   See //build/config/compiler/BUILD.gn:269:18: which caused the file to be included.
2022-08-15T06:32:31.5091394Z       configs += [ "//build/config/linux:compiler" ]
2022-08-15T06:32:31.5091778Z                    ^------------------------------
2022-08-15T06:32:31.5092052Z   GN gen failed: 1
2022-08-15T06:32:31.5092372Z   File "tools/dev/v8gen.py", line 309, in <module>
2022-08-15T06:32:31.5092665Z     sys.exit(gen.main())
2022-08-15T06:32:31.5092955Z   File "tools/dev/v8gen.py", line 303, in main
2022-08-15T06:32:31.5093242Z     return self._options.func()
2022-08-15T06:32:31.5093552Z   File "tools/dev/v8gen.py", line 169, in cmd_gen
2022-08-15T06:32:31.5093828Z     gn_outdir,
2022-08-15T06:32:31.5094090Z   File "tools/dev/v8gen.py", line 213, in _call_cmd
2022-08-15T06:32:31.5094397Z     stderr=subprocess.STDOUT,
2022-08-15T06:32:31.5094749Z   File "/usr/lib64/python3.6/subprocess.py", line 356, in check_output
2022-08-15T06:32:31.5095063Z     **kwargs).stdout
2022-08-15T06:32:31.5095362Z   File "/usr/lib64/python3.6/subprocess.py", line 438, in run
2022-08-15T06:32:31.5095690Z     output=stdout, stderr=stderr)
2022-08-15T06:32:31.5096563Z subprocess.CalledProcessError: Command '['/usr/bin/python3', '-u', 'tools/mb/mb.py', 'gen', '-f', 'infra/mb/mb_config.pyl', '-m', 'developer_default', '-b', 'x64.release.sample', 'out.gn/x64.release.sample']' returned non-zero exit status 1.
2022-08-15T06:32:31.5366822Z ##[error]Process completed with exit code 1.
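All three glib failures above are the same host-side probe: GN's `pkg_config.gni` shells out to `pkg-config --variable=prefix glib-2.0 gmodule-2.0 gobject-2.0 gthread-2.0`, which fails because no glib `.pc` files are on the host's pkg-config search path. A hedged sketch of the likely per-image remedy — the package names below are assumptions about each distro's glib development package, not something taken from these logs:

```python
# Hypothetical mapping from failing CI image to the package that would
# provide the glib-2.0 .pc files pkg-config is probing for.
GLIB_DEV_PACKAGE = {
    "ubuntu": "libglib2.0-dev",  # also Debian (apt)
    "centos": "glib2-devel",     # also RHEL/Fedora (yum/dnf)
    "arch": "glib2",             # Arch ships headers in the main package
}

for distro in ("ubuntu", "centos", "arch"):
    print(f"{distro}: install {GLIB_DEV_PACKAGE[distro]}"
          " (or extend PKG_CONFIG_PATH to cover the glib .pc files)")
```

Installing the listed package (or pointing `PKG_CONFIG_PATH` at an existing glib install) should let the `gn gen` step's pkg-config probe succeed.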

**ArchLinux (v8bridge compilation)**

2022-08-15T07:19:42.7015102Z make: Entering directory '/__w/ngx_wasm_module/ngx_wasm_module/lib/v8bridge'
2022-08-15T07:19:42.7016058Z /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/third_party/llvm-build/Release+Asserts/bin/clang -I /__w/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/include -std=c++17 -O3   -c -o bridge.o bridge.cc
2022-08-15T07:19:43.1663609Z ar rcs libv8bridge.a bridge.o
2022-08-15T07:19:43.2091188Z make: Leaving directory '/__w/ngx_wasm_module/ngx_wasm_module/lib/v8bridge'
2022-08-15T07:19:43.2148217Z make: Entering directory '/__w/ngx_wasm_module/ngx_wasm_module/lib/v8bridge'
2022-08-15T07:19:43.2148907Z cp ./libv8bridge.a /__w/ngx_wasm_module/ngx_wasm_module/work/v8-10.5.18-archlinux/lib/libv8bridge.a
2022-08-15T07:19:43.2175850Z cp ./bridge.h /__w/ngx_wasm_module/ngx_wasm_module/work/v8-10.5.18-archlinux/include/v8bridge.h
2022-08-15T07:19:43.2203840Z make: Leaving directory '/__w/ngx_wasm_module/ngx_wasm_module/lib/v8bridge'
2022-08-15T07:19:43.2208067Z caching built assets in /__w/ngx_wasm_module/ngx_wasm_module/work/downloads/v8-10.5.18-archlinux...
2022-08-15T07:19:43.2263324Z '/__w/ngx_wasm_module/ngx_wasm_module/work/v8-10.5.18-archlinux/include/wasm.h' -> '/__w/ngx_wasm_module/ngx_wasm_module/work/downloads/v8-10.5.18-archlinux/wasm.h'
2022-08-15T07:19:43.2946350Z '/__w/ngx_wasm_module/ngx_wasm_module/work/v8-10.5.18-archlinux/lib/libwee8.a' -> '/__w/ngx_wasm_module/ngx_wasm_module/work/downloads/v8-10.5.18-archlinux/libwee8.a'
2022-08-15T07:19:43.2982742Z '/__w/ngx_wasm_module/ngx_wasm_module/work/v8-10.5.18-archlinux/include/cwabt.h' -> '/__w/ngx_wasm_module/ngx_wasm_module/work/downloads/v8-10.5.18-archlinux/cwabt.h'
2022-08-15T07:19:43.4403489Z '/__w/ngx_wasm_module/ngx_wasm_module/work/v8-10.5.18-archlinux/lib/libcwabt.a' -> '/__w/ngx_wasm_module/ngx_wasm_module/work/downloads/v8-10.5.18-archlinux/libcwabt.a'
2022-08-15T07:19:43.4440245Z '/__w/ngx_wasm_module/ngx_wasm_module/work/v8-10.5.18-archlinux/include/v8bridge.h' -> '/__w/ngx_wasm_module/ngx_wasm_module/work/downloads/v8-10.5.18-archlinux/v8bridge.h'
2022-08-15T07:19:43.4467477Z '/__w/ngx_wasm_module/ngx_wasm_module/work/v8-10.5.18-archlinux/lib/libv8bridge.a' -> '/__w/ngx_wasm_module/ngx_wasm_module/work/downloads/v8-10.5.18-archlinux/libv8bridge.a'

**ArchLinux (link error)**

2022-08-15T07:22:18.5414128Z clang -o /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/nginx \
2022-08-15T07:22:18.5416690Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/nginx.o \
2022-08-15T07:22:18.5417722Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_log.o \
2022-08-15T07:22:18.5418922Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_palloc.o \
2022-08-15T07:22:18.5419797Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_array.o \
2022-08-15T07:22:18.5420624Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_list.o \
2022-08-15T07:22:18.5421461Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_hash.o \
2022-08-15T07:22:18.5422331Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_buf.o \
2022-08-15T07:22:18.5423460Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_queue.o \
2022-08-15T07:22:18.5424376Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_output_chain.o \
2022-08-15T07:22:18.5425242Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_string.o \
2022-08-15T07:22:18.5426073Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_parse.o \
2022-08-15T07:22:18.5426927Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_parse_time.o \
2022-08-15T07:22:18.5427774Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_inet.o \
2022-08-15T07:22:18.5428885Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_file.o \
2022-08-15T07:22:18.5429992Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_crc32.o \
2022-08-15T07:22:18.5430841Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_murmurhash.o \
2022-08-15T07:22:18.5431688Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_md5.o \
2022-08-15T07:22:18.5432515Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_sha1.o \
2022-08-15T07:22:18.5433529Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_rbtree.o \
2022-08-15T07:22:18.5434376Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_radix_tree.o \
2022-08-15T07:22:18.5435221Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_slab.o \
2022-08-15T07:22:18.5436049Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_times.o \
2022-08-15T07:22:18.5436879Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_shmtx.o \
2022-08-15T07:22:18.5437727Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_connection.o \
2022-08-15T07:22:18.5438575Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_cycle.o \
2022-08-15T07:22:18.5439429Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_spinlock.o \
2022-08-15T07:22:18.5440335Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_rwlock.o \
2022-08-15T07:22:18.5441190Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_cpuinfo.o \
2022-08-15T07:22:18.5442045Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_conf_file.o \
2022-08-15T07:22:18.5442896Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_module.o \
2022-08-15T07:22:18.5444192Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_resolver.o \
2022-08-15T07:22:18.5445397Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_open_file_cache.o \
2022-08-15T07:22:18.5446250Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_crypt.o \
2022-08-15T07:22:18.5447121Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_proxy_protocol.o \
2022-08-15T07:22:18.5447983Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_syslog.o \
2022-08-15T07:22:18.5448824Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/event/ngx_event.o \
2022-08-15T07:22:18.5449682Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/event/ngx_event_timer.o \
2022-08-15T07:22:18.5450815Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/event/ngx_event_posted.o \
2022-08-15T07:22:18.5451691Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/event/ngx_event_accept.o \
2022-08-15T07:22:18.5452567Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/event/ngx_event_udp.o \
2022-08-15T07:22:18.5453435Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/event/ngx_event_connect.o \
2022-08-15T07:22:18.5454466Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/event/ngx_event_pipe.o \
2022-08-15T07:22:18.5455486Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_time.o \
2022-08-15T07:22:18.5456319Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_errno.o \
2022-08-15T07:22:18.5457163Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_alloc.o \
2022-08-15T07:22:18.5458273Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_files.o \
2022-08-15T07:22:18.5459129Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_socket.o \
2022-08-15T07:22:18.5459973Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_recv.o \
2022-08-15T07:22:18.5460836Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_readv_chain.o \
2022-08-15T07:22:18.5461679Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_udp_recv.o \
2022-08-15T07:22:18.5462636Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_send.o \
2022-08-15T07:22:18.5463508Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_writev_chain.o \
2022-08-15T07:22:18.5464366Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_udp_send.o \
2022-08-15T07:22:18.5465253Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_udp_sendmsg_chain.o \
2022-08-15T07:22:18.5466132Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_channel.o \
2022-08-15T07:22:18.5466974Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_shmem.o \
2022-08-15T07:22:18.5467831Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_process.o \
2022-08-15T07:22:18.5468680Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_daemon.o \
2022-08-15T07:22:18.5469575Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_setaffinity.o \
2022-08-15T07:22:18.5470457Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_setproctitle.o \
2022-08-15T07:22:18.5471316Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_posix_init.o \
2022-08-15T07:22:18.5472176Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_user.o \
2022-08-15T07:22:18.5473023Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_dlopen.o \
2022-08-15T07:22:18.5473908Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_process_cycle.o \
2022-08-15T07:22:18.5474776Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_linux_init.o \
2022-08-15T07:22:18.5475668Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/event/modules/ngx_epoll_module.o \
2022-08-15T07:22:18.5476638Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_linux_sendfile_chain.o \
2022-08-15T07:22:18.5477520Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_thread_pool.o \
2022-08-15T07:22:18.5478383Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_thread_cond.o \
2022-08-15T07:22:18.5479261Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_thread_mutex.o \
2022-08-15T07:22:18.5480127Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/os/unix/ngx_thread_id.o \
2022-08-15T07:22:18.5481008Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/event/ngx_event_openssl.o \
2022-08-15T07:22:18.5481895Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/event/ngx_event_openssl_stapling.o \
2022-08-15T07:22:18.5482832Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/core/ngx_regex.o \
2022-08-15T07:22:18.5483686Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/ngx_http.o \
2022-08-15T07:22:18.5484791Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/ngx_http_core_module.o \
2022-08-15T07:22:18.5485741Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/ngx_http_special_response.o \
2022-08-15T07:22:18.5486637Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/ngx_http_request.o \
2022-08-15T07:22:18.5487492Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/ngx_http_parse.o \
2022-08-15T07:22:18.5488849Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_log_module.o \
2022-08-15T07:22:18.5489854Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/ngx_http_request_body.o \
2022-08-15T07:22:18.5490740Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/ngx_http_variables.o \
2022-08-15T07:22:18.5491833Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/ngx_http_script.o \
2022-08-15T07:22:18.5492758Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/ngx_http_upstream.o \
2022-08-15T07:22:18.5493669Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/ngx_http_upstream_round_robin.o \
2022-08-15T07:22:18.5494756Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/ngx_http_file_cache.o \
2022-08-15T07:22:18.5495644Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/ngx_http_huff_decode.o \
2022-08-15T07:22:18.5496531Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/ngx_http_huff_encode.o \
2022-08-15T07:22:18.5497616Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/ngx_http_write_filter_module.o \
2022-08-15T07:22:18.5498522Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/ngx_http_header_filter_module.o \
2022-08-15T07:22:18.5499478Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_chunked_filter_module.o \
2022-08-15T07:22:18.5500413Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/v2/ngx_http_v2_filter_module.o \
2022-08-15T07:22:18.5501344Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_range_filter_module.o \
2022-08-15T07:22:18.5502285Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_gzip_filter_module.o \
2022-08-15T07:22:18.5503226Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/ngx_http_postpone_filter_module.o \
2022-08-15T07:22:18.5504263Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_ssi_filter_module.o \
2022-08-15T07:22:18.5505200Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_charset_filter_module.o \
2022-08-15T07:22:18.5506151Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_sub_filter_module.o \
2022-08-15T07:22:18.5507101Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_addition_filter_module.o \
2022-08-15T07:22:18.5508068Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_gunzip_filter_module.o \
2022-08-15T07:22:18.5509030Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_userid_filter_module.o \
2022-08-15T07:22:18.5509984Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_headers_filter_module.o \
2022-08-15T07:22:18.5510900Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/ngx_http_copy_filter_module.o \
2022-08-15T07:22:18.5511851Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_not_modified_filter_module.o \
2022-08-15T07:22:18.5512761Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/v2/ngx_http_v2.o \
2022-08-15T07:22:18.5513641Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/v2/ngx_http_v2_table.o \
2022-08-15T07:22:18.5514530Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/v2/ngx_http_v2_encode.o \
2022-08-15T07:22:18.5515421Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/v2/ngx_http_v2_module.o \
2022-08-15T07:22:18.5516354Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_static_module.o \
2022-08-15T07:22:18.5517290Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_gzip_static_module.o \
2022-08-15T07:22:18.5518299Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_dav_module.o \
2022-08-15T07:22:18.5519521Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_autoindex_module.o \
2022-08-15T07:22:18.5520451Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_index_module.o \
2022-08-15T07:22:18.5521381Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_random_index_module.o \
2022-08-15T07:22:18.5522300Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_mirror_module.o \
2022-08-15T07:22:18.5523228Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_try_files_module.o \
2022-08-15T07:22:18.5524642Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_auth_request_module.o \
2022-08-15T07:22:18.5526246Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_auth_basic_module.o \
2022-08-15T07:22:18.5527232Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_access_module.o \
2022-08-15T07:22:18.5528164Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_limit_conn_module.o \
2022-08-15T07:22:18.5529090Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_limit_req_module.o \
2022-08-15T07:22:18.5530015Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_realip_module.o \
2022-08-15T07:22:18.5530932Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_geo_module.o \
2022-08-15T07:22:18.5531841Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_map_module.o \
2022-08-15T07:22:18.5532781Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_split_clients_module.o \
2022-08-15T07:22:18.5533716Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_referer_module.o \
2022-08-15T07:22:18.5534849Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_rewrite_module.o \
2022-08-15T07:22:18.5535772Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_ssl_module.o \
2022-08-15T07:22:18.5536694Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_proxy_module.o \
2022-08-15T07:22:18.5537611Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_grpc_module.o \
2022-08-15T07:22:18.5538539Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_memcached_module.o \
2022-08-15T07:22:18.5539590Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_empty_gif_module.o \
2022-08-15T07:22:18.5540499Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_browser_module.o \
2022-08-15T07:22:18.5541435Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_secure_link_module.o \
2022-08-15T07:22:18.5542351Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_flv_module.o \
2022-08-15T07:22:18.5543254Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_mp4_module.o \
2022-08-15T07:22:18.5544182Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_upstream_hash_module.o \
2022-08-15T07:22:18.5545146Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_upstream_ip_hash_module.o \
2022-08-15T07:22:18.5546186Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_upstream_least_conn_module.o \
2022-08-15T07:22:18.5547169Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_upstream_random_module.o \
2022-08-15T07:22:18.5548143Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_upstream_keepalive_module.o \
2022-08-15T07:22:18.5549105Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_upstream_zone_module.o \
2022-08-15T07:22:18.5550056Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/http/modules/ngx_http_stub_status_module.o \
2022-08-15T07:22:18.5550952Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream.o \
2022-08-15T07:22:18.5552136Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_variables.o \
2022-08-15T07:22:18.5553020Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_script.o \
2022-08-15T07:22:18.5553902Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_handler.o \
2022-08-15T07:22:18.5554794Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_core_module.o \
2022-08-15T07:22:18.5555693Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_log_module.o \
2022-08-15T07:22:18.5556595Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_proxy_module.o \
2022-08-15T07:22:18.5557477Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_upstream.o \
2022-08-15T07:22:18.5558612Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_upstream_round_robin.o \
2022-08-15T07:22:18.5559597Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_write_filter_module.o \
2022-08-15T07:22:18.5560643Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_ssl_module.o \
2022-08-15T07:22:18.5561551Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_limit_conn_module.o \
2022-08-15T07:22:18.5562494Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_access_module.o \
2022-08-15T07:22:18.5563385Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_geo_module.o \
2022-08-15T07:22:18.5564278Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_map_module.o \
2022-08-15T07:22:18.5565198Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_split_clients_module.o \
2022-08-15T07:22:18.5566115Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_return_module.o \
2022-08-15T07:22:18.5567081Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_set_module.o \
2022-08-15T07:22:18.5568006Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_upstream_hash_module.o \
2022-08-15T07:22:18.5568950Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_upstream_least_conn_module.o \
2022-08-15T07:22:18.5569897Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_upstream_random_module.o \
2022-08-15T07:22:18.5570827Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_upstream_zone_module.o \
2022-08-15T07:22:18.5571753Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/src/stream/ngx_stream_ssl_preread_module.o \
2022-08-15T07:22:18.5590696Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/wasm/ngx_wasm.o \
2022-08-15T07:22:18.5591634Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/wasm/ngx_wasm_ops.o \
2022-08-15T07:22:18.5592497Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/wasm/ngx_wasm_util.o \
2022-08-15T07:22:18.5593358Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/vm/ngx_wavm.o \
2022-08-15T07:22:18.5594213Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/vm/ngx_wavm_host.o \
2022-08-15T07:22:18.5595100Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/wasi/ngx_wasi_host.o \
2022-08-15T07:22:18.5595996Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/common/ngx_wasm_socket_tcp.o \
2022-08-15T07:22:18.5596924Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/common/ngx_wasm_socket_tcp_readers.o \
2022-08-15T07:22:18.5597828Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/proxy_wasm/ngx_proxy_wasm.o \
2022-08-15T07:22:18.5598981Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/proxy_wasm/ngx_proxy_wasm_host.o \
2022-08-15T07:22:18.5599894Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/proxy_wasm/ngx_proxy_wasm_maps.o \
2022-08-15T07:22:18.5600832Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/proxy_wasm/ngx_proxy_wasm_properties.o \
2022-08-15T07:22:18.5601746Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/proxy_wasm/ngx_proxy_wasm_util.o \
2022-08-15T07:22:18.5602613Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/wrt/ngx_wrt_v8.o \
2022-08-15T07:22:18.5603475Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/wasm/ngx_wasm_core_module.o \
2022-08-15T07:22:18.5604357Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/wasm/ngx_wasm_core_host.o \
2022-08-15T07:22:18.5605315Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/http/ngx_http_wasm_module.o \
2022-08-15T07:22:18.5606237Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/http/ngx_http_wasm_directives.o \
2022-08-15T07:22:18.5607153Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/http/ngx_http_wasm_local_response.o \
2022-08-15T07:22:18.5608066Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/http/ngx_http_wasm_headers.o \
2022-08-15T07:22:18.5608974Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/http/ngx_http_wasm_headers_request.o \
2022-08-15T07:22:18.5609887Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/http/ngx_http_wasm_headers_response.o \
2022-08-15T07:22:18.5610816Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/http/ngx_http_wasm_headers_shims.o \
2022-08-15T07:22:18.5611713Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/http/ngx_http_wasm_host.o \
2022-08-15T07:22:18.5612603Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/http/ngx_http_wasm_util.o \
2022-08-15T07:22:18.5613492Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/http/ngx_http_wasm_escape.o \
2022-08-15T07:22:18.5614513Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/proxy_wasm/ngx_http_proxy_wasm.o \
2022-08-15T07:22:18.5615459Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/proxy_wasm/ngx_http_proxy_wasm_dispatch.o \
2022-08-15T07:22:18.5616398Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/http/ngx_http_wasm_filter_module.o \
2022-08-15T07:22:18.5617302Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/addon/stream/ngx_stream_wasm_module.o \
2022-08-15T07:22:18.5618161Z /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/ngx_modules.o \
2022-08-15T07:22:18.5620508Z -lm /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/wasmer-2.3.0/lib/libwasmer.a -ldl -lpthread /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/wasmtime-0.38.1/lib/libwasmtime.a -ldl -lpthread -lpthread -lcrypt -L/__w/ngx_wasm_module/ngx_wasm_module/work/v8-10.5.18-archlinux/lib -lwee8 -lcwabt -lm -lstdc++ -ldl -lpthread -lv8bridge /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/pcre-8.45/.libs/libpcre.a /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/openssl-1.1.1q/.openssl/lib/libssl.a /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/openssl-1.1.1q/.openssl/lib/libcrypto.a -lpthread /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/zlib-1.2.12/libz.a \
2022-08-15T07:22:18.5621862Z -Wl,-E
2022-08-15T07:22:21.1560983Z /usr/sbin/ld: /__w/ngx_wasm_module/ngx_wasm_module/work/v8-10.5.18-archlinux/lib/libv8bridge.a(bridge.o): in function `ngx_v8_enable_wasm_trap_handler':
2022-08-15T07:22:21.1577550Z bridge.cc:(.text+0xb): undefined reference to `v8::V8::EnableWebAssemblyTrapHandler(bool)'
2022-08-15T07:22:21.1578494Z /usr/sbin/ld: /__w/ngx_wasm_module/ngx_wasm_module/work/v8-10.5.18-archlinux/lib/libv8bridge.a(bridge.o): in function `ngx_v8_set_flags':
2022-08-15T07:22:21.1579061Z bridge.cc:(.text+0x21): undefined reference to `v8::V8::SetFlagsFromString(char const*)'
2022-08-15T07:22:21.3619132Z clang-14: error: linker command failed with exit code 1 (use -v to see invocation)
2022-08-15T07:22:21.3658470Z make[1]: *** [/__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/Makefile:366: /__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/build/build-wasmx-nightly-20220815-v8-amd64-archlinux/nginx] Error 1
2022-08-15T07:22:21.3665614Z make[1]: Leaving directory '/__w/ngx_wasm_module/ngx_wasm_module/work/ngx_wasm_module_dist/nginx-1.23.1'
2022-08-15T07:22:21.3835505Z make: *** [Makefile:10: build] Error 2
2022-08-15T07:22:21.4512625Z ##[error]Process completed with exit code 2.
macOS (v8bridge compilation)
2022-08-15T07:37:08.3119910Z building v8bridge...
2022-08-15T07:37:08.3775540Z /Users/runner/work/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/third_party/llvm-build/Release+Asserts/bin/clang -I /Users/runner/work/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/include -std=c++17 -O3   -c -o bridge.o bridge.cc
2022-08-15T07:37:08.5032990Z In file included from bridge.cc:1:
2022-08-15T07:37:08.5033420Z In file included from /Users/runner/work/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/include/v8.h:21:
2022-08-15T07:37:08.5035510Z In file included from /Users/runner/work/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/third_party/llvm-build/Release+Asserts/bin/../include/c++/v1/memory:846:
2022-08-15T07:37:08.5036290Z In file included from /Users/runner/work/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/third_party/llvm-build/Release+Asserts/bin/../include/c++/v1/__memory/allocator.h:18:
2022-08-15T07:37:08.5037020Z In file included from /Users/runner/work/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/third_party/llvm-build/Release+Asserts/bin/../include/c++/v1/new:93:
2022-08-15T07:37:08.5037720Z In file included from /Users/runner/work/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/third_party/llvm-build/Release+Asserts/bin/../include/c++/v1/cstdlib:86:
2022-08-15T07:37:08.5038460Z /Users/runner/work/ngx_wasm_module/ngx_wasm_module/work/libwee8/repos/v8/third_party/llvm-build/Release+Asserts/bin/../include/c++/v1/stdlib.h:93:15: fatal error: 'stdlib.h' file not found
2022-08-15T07:37:08.5038840Z #include_next <stdlib.h>
2022-08-15T07:37:08.5039030Z               ^~~~~~~~~~
2022-08-15T07:37:08.8967370Z 1 error generated.
2022-08-15T07:37:08.8997120Z make: *** [bridge.o] Error 1

Check use of NGX_RPATH in auto/runtime

When grepping for NGX_RPATH I noticed that everywhere else in the Nginx codebase, YES and NO (uppercase) are used as its possible values.

Our code in auto/runtime checks for if [ $NGX_RPATH = yes ].

I also noticed that the code that does sed 's/-L/-Wl,-rpath,/g' is in the else block (I also don't know what -R means in sed 's/-L/-R,/g'). Since things appear to be working, instead of just sending a PR changing yes to YES (which would reverse the test's logic), I thought it better to raise the issue first so we can determine what the correct form of this test should be.

Original commit adding the NGX_RPATH test for more context: 1dbba09
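To make the suspected behavior concrete, here is a minimal sketch (not the actual auto/runtime code; the library path is made up) showing that with NGX_RPATH set to the uppercase YES used elsewhere in Nginx, a lowercase comparison never matches and execution always falls into the else branch that rewrites -L flags:

```shell
# NGX_RPATH is assigned "YES"/"NO" (uppercase) elsewhere in the Nginx
# build scripts, so this lowercase comparison is always false and we
# always fall through to the sed rewrite below.
NGX_RPATH=YES

if [ "$NGX_RPATH" = yes ]; then
    echo "then branch taken"
else
    # rewrite -L<dir> linker flags into -Wl,-rpath,<dir>
    echo "-L/opt/runtime/lib" | sed 's/-L/-Wl,-rpath,/g'
fi
```

Under this reading, flipping the comparison to YES would indeed invert which runtimes get an rpath, which is why confirming the intended logic first seems safer.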

Parallel dispatch calls freeze when Lua resolver yields

Problem summary: If you dispatch two parallel HTTP calls, and both call into Lua for DNS resolution, and both yield to wait for the DNS response, only the second one gets a response. The first one waits forever.

Background and some observations of the behavior

This was first observed with Kong Gateway, but it is also reproducible in this module's test suite (see test cases in PR #523).

Using two different hostnames allows us to see in the logs that the second DNS request is the one that gets a response.

Yielding affects the behavior:

  • if you issue a request that triggers two parallel calls for the same hostname, the request hangs because the first DNS request yields and never resumes. If you Ctrl-C the request and try again, the overall request succeeds: the hostname was cached by the successful call of the previous request, so both calls succeed.
  • if you do the same with a request that triggers two parallel calls for different hostnames, the first request behaves the same way (it hangs because the first DNS request yields and never resumes). The second request hangs as well: it tries the first hostname, which never resolved, and yields; it then tries the second hostname, which is cached, and succeeds; but the overall request is still waiting on the first hostname.

In short, it seems that triggering the Lua resolver a second time discards the pending DNS resolution coroutine.
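The suspected mechanism can be sketched with Python generators standing in for Lua coroutines (this is an illustrative analogue, not the actual ngx_lua code; the registry name and addresses are made up). Starting a second resolution wipes the shared context, orphaning the first still-yielded resolver:

```python
import inspect

# Hypothetical registry of yielded "resolver threads", analogous to the
# per-request Lua ctx that gets reset by the second dispatch.
pending = {}

def resolver(host):
    reply = yield                      # yield while waiting for the DNS reply
    return f"{host} -> {reply}"

def start(host):
    gen = resolver(host)
    next(gen)                          # run the thread until its first yield
    pending[host] = gen
    return gen

first = start("gobolinux.org")

# The second dispatch "resets the ctx" and finalizes all pending threads,
# discarding the still-suspended first resolver along with its timer.
pending.clear()
second = start("example.com")

# Only the second resolver can be resumed to completion; the first stays
# suspended forever because nothing will ever send it a reply.
try:
    second.send("93.184.216.34")
except StopIteration as done:
    print(done.value)                  # example.com -> 93.184.216.34
print(inspect.getgeneratorstate(first))  # GEN_SUSPENDED
```

This matches the logs below, where creating the second thread is immediately followed by finalizing and deleting the first one.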

Logs

Enabling "debug_mail" logs on the Kong Gateway showed some interesting bits which may provide some clues. Here's what I've observed:

First we create a new Lua thread to resolve gobolinux.org, all is normal so far:

2024/03/22 16:25:45 [debug] 90985#0: *2 lua creating new thread
2024/03/22 16:25:45 [debug] 90985#0: lua ref lua thread 00005A6A986C62C0 (ref 649)
2024/03/22 16:25:45 [debug] 90985#0: *2 code cache lookup (key='wasm_lua_resolver_chunkdd7007ea3c1caf68390921fc5ddb939c', ref=0)
2024/03/22 16:25:45 [debug] 90985#0: *2 code cache miss (key='wasm_lua_resolver_chunkdd7007ea3c1caf68390921fc5ddb939c', ref=0)
2024/03/22 16:25:45 [debug] 90985#0: *2 wasm running lua thread (lctx: 00005A6A96928E80, L: 00005A6A95FFCD30, co: 00005A6A986C62C0)
2024/03/22 16:25:45 [debug] 90985#0: *2 lua reset ctx
2024/03/22 16:25:45 [debug] 90985#0: *2 http lua finalize threads
2024/03/22 16:25:45 [debug] 90985#0: *2 lua run thread, top:0 c:1
2024/03/22 16:25:45 [debug] 90985#0: *2 [lua] [string "wasm_lua_resolver_chunk"]:13: wasm lua resolver thread
2024/03/22 16:25:45 [debug] 90985#0: *2 [lua] [string "wasm_lua_resolver_chunk"]:20: wasm lua resolver using existing dns_client
2024/03/22 16:25:45 [debug] 90985#0: *2 [lua] [string "wasm_lua_resolver_chunk"]:42: wasm lua resolving "gobolinux.org"

This causes the DNS client to send the UDP query for DNS resolution on fd 34. It does not get an immediate response, so it yields:

2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp socket network address given directly
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp socket resolve retval handler
2024/03/22 16:25:45 [debug] 90985#0: *2 UDP socket 34
2024/03/22 16:25:45 [debug] 90985#0: *2 connect to 127.0.0.53:53, fd:34 #646
2024/03/22 16:25:45 [debug] 90985#0: *2 epoll add event: fd:34 op:1 ev:80002001
2024/03/22 16:25:45 [debug] 90985#0: *2 add cleanup: 00005A6A964C8870
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp socket connect: 0
2024/03/22 16:25:45 [debug] 90985#0: *2 [lua] client.lua:737: lookup(): WILL QUERY gobolinux.org
2024/03/22 16:25:45 [debug] 90985#0: *2 send: fd:34 31 of 31
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp socket calling receive() method
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp socket read timeout: 2000
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp socket receive buffer size: 4096
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp socket read data: waiting: 0
2024/03/22 16:25:45 [debug] 90985#0: *2 recv: fd:34 -1 of 4096
2024/03/22 16:25:45 [debug] 90985#0: *2 recv() not ready (11: Resource temporarily unavailable)
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp recv returned -2
2024/03/22 16:25:45 [debug] 90985#0: *2 event timer add: 34: 2000:10471706
2024/03/22 16:25:45 [debug] 90985#0: *2 lua resume returned 1
2024/03/22 16:25:45 [debug] 90985#0: *2 lua thread yielded

(we then get this, which I don't know if it's related at all)

2024/03/22 16:25:45 [debug] 90985#0: posted event 00005A6A9868DF30
2024/03/22 16:25:45 [debug] 90985#0: *2 delete posted event 00005A6A9868DF30

What follows immediately is the next dispatch, which needs to resolve example.com:

2024/03/22 16:25:45 [debug] 90985#0: *2 proxy_wasm http dispatch connecting...
2024/03/22 16:25:45 [debug] 90985#0: *2 wasm tcp socket resolving: example.com
2024/03/22 16:25:45 [debug] 90985#0: *2 wasm tcp socket using default resolver
2024/03/22 16:25:45 [debug] 90985#0: *2 wasm tcp socket resolving...
2024/03/22 16:25:45 [debug] 90985#0: *2 lua creating new thread
2024/03/22 16:25:45 [debug] 90985#0: lua ref lua thread 00005A6A9678DC20 (ref 650)
2024/03/22 16:25:45 [debug] 90985#0: *2 code cache lookup (key='wasm_lua_resolver_chunkdd7007ea3c1caf68390921fc5ddb939c', ref=0)
2024/03/22 16:25:45 [debug] 90985#0: *2 code cache miss (key='wasm_lua_resolver_chunkdd7007ea3c1caf68390921fc5ddb939c', ref=0)
2024/03/22 16:25:45 [debug] 90985#0: *2 wasm running lua thread (lctx: 00005A6A97EE2D90, L: 00005A6A95FFCD30, co: 00005A6A9678DC20)

But unlike the dispatch above, where the three lines `lua reset ctx`, `http lua finalize threads` and `lua run thread` appeared next to each other, this time we get:

2024/03/22 16:25:45 [debug] 90985#0: *2 lua reset ctx
2024/03/22 16:25:45 [debug] 90985#0: *2 http lua finalize threads
2024/03/22 16:25:45 [debug] 90985#0: *2 lua finalize socket
2024/03/22 16:25:45 [debug] 90985#0: *2 lua close socket connection
2024/03/22 16:25:45 [debug] 90985#0: *2 event timer del: 34: 10471706
2024/03/22 16:25:45 [debug] 90985#0: *2 reusable connection: 0
2024/03/22 16:25:45 [debug] 90985#0: *2 lua deleting light thread 00005A6A986C62C0 (ref 649)
2024/03/22 16:25:45 [debug] 90985#0: *2 lua caching unused lua thread 00005A6A986C62C0 (ref 649)
2024/03/22 16:25:45 [debug] 90985#0: *2 lua run thread, top:0 c:1

It looks like it deleted the coroutine ending in `62C0`, which was the one from the gobolinux.org DNS resolution above!?
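If that reading is right, the failure mode is the second dispatch's context reset finalizing a thread that still belongs to the first, in-flight lookup. As a loose analogy only (Python asyncio, not the module's code; names are made up):

```python
import asyncio

async def lookup(name: str) -> str:
    # stand-in for a pending DNS query: sleep instead of awaiting a UDP reply
    await asyncio.sleep(0.05)
    return f"{name} resolved"

async def main() -> list:
    pending = [asyncio.create_task(lookup("gobolinux.org"))]
    await asyncio.sleep(0)  # let the first lookup start and yield
    # the second dispatch arrives; a buggy context reset finalizes every
    # thread tracked so far, including the still-pending first lookup
    for t in pending:
        t.cancel()
    pending.append(asyncio.create_task(lookup("example.com")))
    return await asyncio.gather(*pending, return_exceptions=True)

results = asyncio.run(main())
print(results)  # first lookup surfaces CancelledError, second succeeds
```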

The query for example.com then proceeds in a similar fashion to the previous one... (even reusing the same fd:34)

2024/03/22 16:25:45 [debug] 90985#0: *2 [lua] [string "wasm_lua_resolver_chunk"]:13: wasm lua resolver thread
2024/03/22 16:25:45 [debug] 90985#0: *2 [lua] [string "wasm_lua_resolver_chunk"]:20: wasm lua resolver using existing dns_client
2024/03/22 16:25:45 [debug] 90985#0: *2 [lua] [string "wasm_lua_resolver_chunk"]:42: wasm lua resolving "example.com"
2024/03/22 16:25:45 [debug] 90985#0: *2 [lua] [string "wasm_lua_resolver_chunk"]:44: has individual_toip: function: 0x5a6a962ef8f0
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp socket network address given directly
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp socket resolve retval handler
2024/03/22 16:25:45 [debug] 90985#0: *2 UDP socket 34
2024/03/22 16:25:45 [debug] 90985#0: *2 connect to 127.0.0.53:53, fd:34 #647
2024/03/22 16:25:45 [debug] 90985#0: *2 epoll add event: fd:34 op:1 ev:80002001
2024/03/22 16:25:45 [debug] 90985#0: *2 add cleanup: 00005A6A96281D30
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp socket connect: 0
2024/03/22 16:25:45 [debug] 90985#0: *2 [lua] client.lua:737: lookup(): WILL QUERY example.com
2024/03/22 16:25:45 [debug] 90985#0: *2 send: fd:34 29 of 29

...but then, after a while, we do get data coming in on fd:34, and a query result for example.com:

2024/03/22 16:25:45 [debug] 90985#0: epoll timer: 837
2024/03/22 16:25:45 [debug] 90985#0: epoll: fd:34 ev:0001 d:00005A6A9888E191
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp socket handler for "/anything?", wev 0
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp socket read handler
2024/03/22 16:25:45 [debug] 90985#0: *2 event timer del: 34: 10471706
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp socket read data: waiting: 1
2024/03/22 16:25:45 [debug] 90985#0: *2 recv: fd:34 85 of 4096
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp recv returned 85
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp socket waking up the current request
2024/03/22 16:25:45 [debug] 90985#0: *2 wasm rctx reused: 00005A6A964161C0 (r: 00005A6A97B8D430, main: 1)
2024/03/22 16:25:45 [debug] 90985#0: *2 wasm wev handler "/anything?" - timeout: 0, ready: 1 (main: 1, count: 1, resp_finalized: 0, state: 2)
2024/03/22 16:25:45 [debug] 90985#0: *2 wasm resuming lua thread (lctx: 00005A6A97EE2D90, L: 00005A6A95FFCD30, co: 00005A6A9678DC20)
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp operation done, resuming lua thread
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp socket calling prepare retvals handler 00005A6A943F1684, u:00005A6A98CEEED0
2024/03/22 16:25:45 [debug] 90985#0: *2 lua udp socket receive return value handler
2024/03/22 16:25:45 [debug] 90985#0: *2 lua run thread, top:0 c:1
2024/03/22 16:25:45 [debug] 90985#0: *2 [lua] client.lua:739: lookup(): QUERY GOT RESULT
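For reference, the 85 bytes read back at `recv: fd:34 85 of 4096` are a full DNS response: the same 12-byte header layout as the query (with the QR bit set and ANCOUNT > 0), followed by the echoed question and the answer records. A sketch of reading just the header, against a hypothetical 85-byte reply (the zero padding stands in for the question/answer sections, which can't be reconstructed from the log):

```python
import struct

def parse_header(resp: bytes) -> dict:
    # same ">HHHHHH" layout as the query header
    txid, flags, qd, an, ns, ar = struct.unpack(">HHHHHH", resp[:12])
    return {"id": txid, "qr": bool(flags & 0x8000), "ancount": an}

# hypothetical reply: QR bit set, one question, one answer; the remaining
# 73 bytes (question + answer sections) are elided as padding
reply = struct.pack(">HHHHHH", 0, 0x8180, 1, 1, 0, 0) + b"\x00" * 73
info = parse_header(reply)
print(info["qr"], info["ancount"])  # True 1
print(len(reply))                   # 85
```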
