lua-resty-http's People

Contributors

acayf, agentzh, chenfengyuan, chobits, claresun, hellosa, liseen, qrof, smallfish, subnetmarco, wangchll, wendal

lua-resty-http's Issues

I have this code, it returns nil

local http = require("resty.http")
local hc = http:new()
local ok, code, headers, status, body = hc:request({
  url = "http://www.tokoled.net/uploads/1920.jpg",
  method = "GET",
  headers = { },
  body = ""
})
return ngx.say(body)

Is my code correct? I have a new project to migrate another website to Lua + OpenResty, but it needs PHP curl, so I think this package is suitable to replace the PHP curl functions.
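One way to see why body comes back nil is to inspect the other return values; a minimal sketch, assuming the return signature used throughout these issues (ok is nil on failure and code then holds the error string):

local http = require("resty.http")
local hc = http:new()
local ok, code, headers, status, body = hc:request({
  url = "http://www.tokoled.net/uploads/1920.jpg",
  method = "GET",
})
if not ok then
  -- on failure, `code` carries the error message (e.g. a resolver or timeout error)
  ngx.say("request failed: ", tostring(code))
else
  ngx.say("HTTP status: ", tostring(status))
  ngx.say(body)
end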

Suggestion

Maintainers, please add a GET example to the Synopsis, so that newcomers have a good example too!
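A minimal GET example in the style of the request calls used in these issues could look like the sketch below (the URL is only a placeholder):

local http = require("resty.http")
local hc = http:new()

-- a plain GET request against the placeholder URL
local ok, code, headers, status, body = hc:request({
  url = "http://example.com/",
  method = "GET",
})

ngx.say(body)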

handshake failed

[error] 4733#0: *1 peer closed connection in SSL handshake, client: 172.16.48.28, server: localhost, request: "POST /lua/index HTTP/1.1", host: "172.16.48.31"

SSL handshake problem

I'm receiving the following error when trying to consume an SSL endpoint (the GitHub API):

2015/02/18 11:06:01 [error] 3087#0: *28889 SSL_do_handshake() failed (SSL: error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol), client: 52.1.6.14, server: , request: "POST /oauth_url HTTP/1.1", host: "mashaper-github.p.mashape.com"

I found somewhere that this problem could be related to the following:

This error happens when OpenSSL receives something other than a ServerHello in a protocol version it understands from the server. It can happen if the server answers with a plain (unencrypted) HTTP. It can also happen if the server only supports e.g. TLS 1.2 and the client does not understand that protocol version. Normally, servers are backwards compatible to at least SSL 3.0 / TLS 1.0, but maybe this specific server isn't (by implementation or configuration).
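The "unknown protocol" reply usually means the client received plain HTTP where it expected a TLS ServerHello, for example because the request went to port 80. A minimal sketch that makes the HTTPS scheme and port explicit, mirroring the request shape in the next issue; whether this resolves the handshake error here is an assumption:

local http = require("resty.http")
local hc = http:new()
local ok, code, headers, status, body = hc:request({
  url = "https://api.github.com/",
  scheme = "https",  -- ensure the TLS handshake is attempted...
  port = "443",      -- ...against the HTTPS port rather than plain HTTP on 80
  method = "GET",
})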

HTTPS support ?

Hi,

First, congratulations on a great lib. I was just wondering whether HTTPS is supported? I don't seem to have any luck with it; I did manage to get plain HTTP working fine.

thanks

-seb

local http = require("resty.http")
local hc = http:new()
-- post_args_escaped is defined elsewhere in the original code
local ok, code, headers, status, body = hc:request {
    url = "https://secure.com/auth",
    method = "POST",
    scheme = "https",
    port = "443",
    headers = {
        ["Content-Length"] = string.len(post_args_escaped),
        ["Content-Type"] = "application/x-www-form-urlencoded",
    },
    body = post_args_escaped,
}

Example of big file proxy

I'm using Lua with nginx and I need to do some piping. I need to authenticate against a 3rd-party website in Lua and then download a file, not to my server, but piped directly to the user.

nginx ==> 3rdparty/auth
nginx <== 3rdparty: You are logged in
nginx ==> 3rdparty/get/file/x.tar.gz;
nginx <== 3rdparty: Here you are... binary data
nginx.on_3rdparty_chunk(data) ==> nginx.send_to_client(data)

Is it possible? Can you give a simple example?
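A rough sketch of one possible approach, reusing the body_callback mechanism that appears in the high-CPU-usage issue further down this page: stream each chunk to the downstream client as it arrives instead of buffering the whole file. The authentication step and the URL are placeholders:

local http = require("resty.http")
local hc = http:new()

-- (the auth request against the 3rd-party site would go here)

local ok, code, headers, status, body = hc:request({
  url = "http://thirdparty.example/get/file/x.tar.gz",
  method = "GET",
  body_callback = function(data, chunked_header, ...)
    if chunked_header then return end  -- skip chunk headers
    ngx.print(data)                    -- forward the chunk to the client
    ngx.flush(true)                    -- push it out immediately
  end,
})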

The file “http.lua” has a problem

Lines 277-286 in http.lua:

local h = "\r\n"
for i, v in pairs(nreqt.headers) do
    -- fix cookie is a table value
    if type(v) == "table" and i == "cookie" then
        v = table.concat(v, "; ")
    end
    h = i .. ": " .. v .. "\r\n" .. h
end

h = h .. '\r\n' -- close headers

This code snippet effectively turns the start of the request body into "\r\n". My recommendation is as follows:

old code:
local h = "\r\n"
new code:
local h = ""

bug: infinite loop in receivebody()

When http.lua receives a chunked body, if the upstream peer closes the socket, there may be an infinite loop in the receivebody() function:

local function receivebody(sock, headers, nreqt)
    ...
    if t and t ~= "identity" then
        -- chunked
        while true do
            local chunk_header = sock:receiveuntil("\r\n")
            local data, err, partial = chunk_header()  -- <<< chunk_header may return nil, "closed"
            if not err then
                if data == "0" then
                    return body -- end of chunk
                else
                    local length = tonumber(data, 16)

                    -- TODO check nreqt.max_body_size !!

                    local ok, err = read_body_data(sock, length, nreqt.fetch_size, callback)
                    if err then
                        return nil, err
                    end
                end
            end
        end
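A sketch of the kind of fix the report is pointing at: abort the loop as soon as reading the chunk header returns an error, so a closed connection can no longer spin forever. This is only an illustration of the reporter's suggestion applied to the loop body above, not the project's actual patch:

            local data, err, partial = chunk_header()
            if err then
                -- the peer closed the connection or the read timed out: stop looping
                return nil, err
            end
            if data == "0" then
                return body -- end of chunks
            end
            local length = tonumber(data, 16)
            local ok, err = read_body_data(sock, length, nreqt.fetch_size, callback)
            if err then
                return nil, err
            end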

resolver isn't needed ?

hi there,

the resolver in the request function is redundant, IMHO.

The tcpsock:connect API already does the resolving, using a more efficient mechanism provided by nginx itself: that resolver uses an rbtree to cache results and handles expiration.

So maybe it's not needed? Correct me if I'm wrong, thanks. :)

read status line failed read status line failed timeout

OS: macOS
Docker image built with:

FROM alpine
MAINTAINER edwin_uestc <[email protected]>

ENV LUA_SUFFIX=jit-2.1.0-beta1 \
    LUAJIT_VERSION=2.1 \
    NGINX_PREFIX=/opt/openresty/nginx \
    OPENRESTY_PREFIX=/opt/openresty \
    OPENRESTY_SRC_SHA1=1a2029e1c854b6ac788b4d734dd6b5c53a3987ff \
    OPENRESTY_VERSION=1.9.7.3 \
    VAR_PREFIX=/var/nginx

RUN sed -i 's/dl-cdn.alpinelinux.org/mirrors.ustc.edu.cn/' /etc/apk/repositories


RUN set -ex \
  && apk --no-cache add --virtual .build-dependencies \
    curl \
    make \
    musl-dev \
    gcc \
    ncurses-dev \
    openssl-dev \
    pcre-dev \
    perl \
    readline-dev \
    zlib-dev \
  \
  && curl -fsSL http://openresty.org/download/openresty-${OPENRESTY_VERSION}.tar.gz -o /tmp/openresty.tar.gz \
  \
  && cd /tmp \
  && echo "${OPENRESTY_SRC_SHA1} *openresty.tar.gz" | sha1sum -c - \
  && tar -xzf openresty.tar.gz \
  \
  && cd openresty-* \
  && readonly NPROC=$(grep -c ^processor /proc/cpuinfo 2>/dev/null || 1) \
  && ./configure \
    --prefix=${OPENRESTY_PREFIX} \
    --http-client-body-temp-path=${VAR_PREFIX}/client_body_temp \
    --http-proxy-temp-path=${VAR_PREFIX}/proxy_temp \
    --http-log-path=${VAR_PREFIX}/access.log \
    --error-log-path=${VAR_PREFIX}/error.log \
    --pid-path=${VAR_PREFIX}/nginx.pid \
    --lock-path=${VAR_PREFIX}/nginx.lock \
    --with-luajit \
    --with-pcre-jit \
    --with-ipv6 \
    --with-http_ssl_module \
    --without-http_ssi_module \
    --with-http_realip_module \
    --without-http_scgi_module \
    --without-http_uwsgi_module \
    --without-http_userid_module \
    -j${NPROC} \
  && make -j${NPROC} \
  && make install \
  \
  && rm -rf /tmp/openresty-* \
  && apk del .build-dependencies

RUN ln -sf ${NGINX_PREFIX}/sbin/nginx /usr/local/bin/nginx \
  && ln -sf ${NGINX_PREFIX}/sbin/nginx /usr/local/bin/openresty \
  && ln -sf ${OPENRESTY_PREFIX}/bin/resty /usr/local/bin/resty \
  && ln -sf ${OPENRESTY_PREFIX}/luajit/bin/luajit-* ${OPENRESTY_PREFIX}/luajit/bin/lua \
  && ln -sf ${OPENRESTY_PREFIX}/luajit/bin/luajit-* /usr/local/bin/lua

RUN apk --no-cache add \
    libgcc \
    libpcrecpp \
    libpcre16 \
    libpcre32 \
    libssl1.0 \
    libstdc++ \
    openssl \
    pcre

WORKDIR $NGINX_PREFIX

CMD ["nginx", "-g", "daemon off; error_log /dev/stderr info;"]


I am trying to build a POST request using the following code:



            local http = require "resty.http"
            local hc = http:new()

            local ok, code, headers, status, body  = hc:request {
                url = "http://wi.hit.edu.cn/cemr/",
                -- proxy = "202.118.253.110:80",
                timeout = 3000,
                method = "POST", -- POST or GET

                headers = { ["Upgrade-Insecure-Requests"]="1",["referer"]="http://wi.hit.edu.cn/cemr/",["cache-control"]="max-age=0",["connection"]="keep-alive",["user-agent"]="Mozilla/5.0 (Macintosh; Intel Mac OS X 10.11; rv:48.0) Gecko/20100101 Firefox/48.0",["host"]="wi.hit.edu.cn",["Content-Length"]="705",["Content-Type"] = "application/x-www-form-urlencoded" },
                -- body = "source=1. 患者既往有高血压病史,最高达180/100mmHg,口服北京降压灵、利血平治疗,有脑梗死病史,遗留语笨及左侧肢体无力.2. 门诊行头CT检查,显示左侧侧脑室体旁低密度病灶,以脑梗死收入我科."
                -- -- headers = { UserAgent = "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.56 Safari/535.11"}
            }

            ngx.say(ok)
            ngx.say(code)
            ngx.say(status)
            ngx.say(body)

The outcome I am looking for is to reproduce the POST request to the following page:
http://wi.hit.edu.cn/cemr/
The full request info I get from the Firefox debug tools is:

Host: wi.hit.edu.cn
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.11; rv:48.0) Gecko/20100101 Firefox/48.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: zh-CN,zh;q=0.8,en-US;q=0.5,en;q=0.3
Accept-Encoding: gzip, deflate
Referer: http://wi.hit.edu.cn/cemr/
Connection: keep-alive
Upgrade-Insecure-Requests: 1
Cache-Control: max-age=0

form-data  
source:"1.+患者既往有高血压病史,最高达180/100mmHg,口服"北京降压灵、利血平"治疗,有脑梗死病史,遗留语笨及左侧肢体无力.
2.+门诊行头CT检查,显示左侧侧脑室体旁低密度病灶,以"脑梗死"收入我科."

The full error is:

2016/09/17 05:53:47 [error] 7#0: *10 lua tcp socket read timed out, client: 172.17.0.1, server: , request: "GET /cemr HTTP/1.1", host: "localhost:8080"

but curl works:
curl -d "source=患者既往有高血压病史,最高达180/100mmHg,口服北京降压灵、 利血平治疗,有脑梗死病史,遗留语笨及左侧肢体无力." http://wi.hit.edu.cn/cemr/ > result.html
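For comparison, a minimal sketch that actually sends a body and keeps the Content-Length header consistent with it; in the code above the body lines are commented out while Content-Length is still set to 705, so the server may simply be waiting for a body that never arrives (that reading is an assumption, not a confirmed diagnosis):

local http = require("resty.http")
local hc = http:new()

local post_body = "source=..."  -- the url-encoded form data (elided here)

local ok, code, headers, status, body = hc:request {
    url = "http://wi.hit.edu.cn/cemr/",
    timeout = 3000,
    method = "POST",
    headers = {
        ["Content-Type"] = "application/x-www-form-urlencoded",
        -- derive the length from the body actually being sent
        ["Content-Length"] = tostring(#post_body),
    },
    body = post_body,
}
ngx.say(code)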

Is there a problem with the example code?

The example shows that you can add headers = { Cookie = "ABCDEFG", ["Content-Type"] = "application/x-www-form-urlencoded" } to the request. Clearly the cookie added here is not a table, yet looking at line 296 of the http.lua source, a cookie entry must be a table to be parsed correctly. Is my understanding right?

Request error, returned code: read status line failed read status line failed timeout

The code I use is as follows:

local ok, code, headers, status, body = hc:request {
    url = url,
    method = "POST",
    body = postBody,
    headers = { ["Content-Type"] = "application/x-www-form-urlencoded" },
}

The returned information:
code = 'read status line failed read status line failed timeout'
The HTTP status code on the server side is 499.
I saw a similar question asked earlier, but I didn't understand the reply.
Please take a look at this problem, thanks!

Error - sock connected failed no resolver defined to resolve

Hello

I'm very new and maybe this is the issue :)

I'm just trying to make a test, following the example and changing it a little bit:

location /test {
            content_by_lua '
                local http = require("resty.http")
                local hc = http:new()

                local ok, code, headers, status, body  = hc:request {
                    url = "http://www.google.com",
                    method = "GET",
                    headers = { ["Content-Type"] = "application/json" },
                    body = "",
                }

                ngx.say(ok)
                ngx.say(code)
                ngx.say(body)
            ';
        }

but I'm getting this error:


nil
sock connected failed no resolver defined to resolve "www.google.com"
nil

What is wrong, or what am I missing?

thanks
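"no resolver defined" means nginx has no DNS resolver configured for cosocket connections, so hostnames such as www.google.com cannot be looked up. A minimal sketch of adding one in the http block of nginx.conf; the DNS server address is just an example:

http {
    # allow Lua cosockets to resolve hostnames
    resolver 8.8.8.8 valid=300s;
    resolver_timeout 5s;
    ...
}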

multipart/form-data post error

Yet another multipart/form-data error problem.....

I have now assembled the following request:

  • Table String:
    | { [table: 0x41dead18, 13 item(s)]
    | | *max_body_size => 1073741824
    | | *fetch_size => 16384
    | | *uri => /submit
    | | *timeout => 100
    | | *headers =>
    | | { [table: 0x41de7e60, 6 item(s)]
    | | | *host => localhost
    | | | *content-type => multipart/form-data; boundary=BOUNDARY-OOYZUGIKJHFFZAFM-BOUNDARY
    | | | *connection => close, TE
    | | | *content-length => 34761
    | | | *te => trailers
    | | | *user-agent => resty.http/0.2
    | | }
    | | *host => localhost
    | | *method => POST
    | | *scheme => http
    | | *authority => localhost:9333
    | | *url => http://localhost:9333/submit
    | | *body => --BOUNDARY-OOYZUGIKJHFFZAFM-BOUNDARY
    content-disposition: form-data; name="myfile"; filename="portrait"
    content-type: application/octet-stream
    content-transfer-encoding: binary

(... binary JPEG data ...)

But when I call it to send the request:

local ok, code, headers, status, body = hc:request {
    url = weedfsurl.submit,
    timeout = 100,
    method = "POST",
    headers = rq.headers,
    body = rq.source,
}

The result that comes back is below. My storage backend here is weedfs, listening on 127.0.0.1:9333, and yet the socket cannot even connect:

sock connected failed localhost could not be resolved (110: Operation timed out)

Sending the request this way doesn't seem reliable... For multipart/form-data requests, please share some ideas.

Thanks.

Hello, the body sent via POST is missing content; I don't know what the problem is

Lua code:

local url_string = 'http://localhost/test.php'
local http = require("resty.http")
local hc = http:new()
local ok, code, headers, status, body   = hc:request {
    url = url_string,
    timeout = 2000,
    method = "POST",
    keepalive = false,
    body = "1234567890"
}

PHP code:

<?php
file_put_contents('/tmp/t.txt', file_get_contents('php://input'));

The body sent is 1234567890, but what is actually received is 12345678.
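A sketch that pins the Content-Length header explicitly to the body being sent, following the pattern used in the HTTPS example earlier on this page; whether a missing or mismatched length header is really what truncates the body here is an assumption:

local http = require("resty.http")
local hc = http:new()

local payload = "1234567890"

local ok, code, headers, status, body = hc:request {
    url = "http://localhost/test.php",
    timeout = 2000,
    method = "POST",
    keepalive = false,
    -- state the length of the payload explicitly
    headers = { ["Content-Length"] = string.len(payload) },
    body = payload,
}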

Releases

Can you start making releases for every new version?

It makes deployment a lot easier in installation scripts.

high cpu usage when using lua-http-resty

I am using NGINX 1.9.13 with lua-resty-http.
The CPU load is abnormally high when making HTTP requests via the lua-resty-http module (https://github.com/liseen/lua-resty-http):

PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND
8446 root   20   0  209m 142m 1748 R 99.1  3.7   0:31.27 nginx

The bigger the request body, the higher the CPU usage. In this case the body was 20 MB. With multiple concurrent requests, it kills the CPU.

QUESTION: Is there a way to reduce the CPU load when using lua-resty-http?

I have added the NGINX Lua code as well as the NGINX configuration below.
NGINX is used as a front-end proxy towards an internal web server.

The Lua code below (content.lua) is called multiple times from the NGINX server, once for each curl HTTP request:

local ok, code, headers, status, body = nil, 0, nil, 0, nil
local Http = require("resty.http") -- https://github.com/liseen/lua-resty-http
local Httpc = Http.new()
local req_headers = ngx.req.get_headers()    
req_headers["range"] = "bytes=0-20000000" -- example of range
req_headers["connection"] = "keep-alive"
req_headers["if-range"] = nil

local req_params = {
    url = "http://127.0.01:8099" .. ngx.var.request_uri,
    method = 'GET',
    headers = req_headers,
    keepalive = 30000 -- 30 seconds
}

req_params["code_callback"] =
   function(statuscode)
     code = statuscode
   end

req_params["header_callback"] =
   function(headers)
       if headers and headers["x-custom"] and ngx.re.match(headers ["x-custom"],"CUSTOM_STR") then
          --increment a counter 
       else
          --increment another counter
       end         
    end

req_params["body_callback"] =
    function(data, chunked_header, ...)
        if chunked_header or code ~= 206 then return end
        ngx.print(data)
        ngx.flush(true)
    end

ok, code, headers, status, body = Httpc:request(req_params)
local start, stop = 0, 20000000 -- example of value
if (code == 206) and body then
   ngx.print(string.sub(body, start, stop))
end

Below is my NGINX configuration:

worker_processes  auto;
events {
 worker_connections  2048;
 multi_accept on;
 use epoll;
}

http {
  include ./mime.types;
  default_type application/octet-stream;

  proxy_cache_bypass 1;
  proxy_no_cache 1;
  proxy_set_header Host $http_host;
  proxy_http_version 1.1;
  proxy_set_header Connection "";
  proxy_buffering off;
  proxy_buffer_size 128k;
  proxy_buffers 100 128k;
  sendfile on;
  tcp_nopush on;
  keepalive_timeout 10;
  reset_timedout_connection on;

  upstream backend {
    server 127.0.0.1:8090;
    keepalive 20;
  }

  server {
    listen 192.168.0.10:8080;

    location ~ \.(mp4|jpg)$ {  
      lua_check_client_abort on;
      lua_code_cache on;
      lua_http10_buffering off;

      #request the lua code  
      content_by_lua_file '/opt/location/content.lua';
    }
  }

  server {
    listen 127.0.0.1:8099;
    access_log off;        

    location / {
       proxy_pass http://backend;
    }
  }

}

Proxy configuration

How do I configure a connection to e.g. a Squid proxy running on localhost? I need to send requests to an endpoint on the internet through this proxy. In the proxy example, it seems that httpc:connect(HOST, PORT) configures the endpoint. So how should I tell Lua to use the proxy on a specific port on localhost? Is it possible?

local http = require "resty.http"
local httpc = http.new()

httpc:set_timeout(500)
local ok, err = httpc:connect(HOST, PORT)

if not ok then
  ngx.log(ngx.ERR, err)
  return
end

httpc:set_timeout(2000)
httpc:proxy_response(httpc:proxy_request())
httpc:set_keepalive()
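One generic approach, sketched here as an assumption rather than taken from the library documentation: connect the cosocket to the proxy itself and put the absolute URI in the request line, which is how plain HTTP forward proxies expect to receive requests. The proxy address, port, and target URL are placeholders:

local http = require "resty.http"
local httpc = http.new()

-- connect to the local proxy, not to the final endpoint
local ok, err = httpc:connect("127.0.0.1", 3128)
if not ok then
  ngx.log(ngx.ERR, err)
  return
end

-- ask the proxy for the remote resource by sending the absolute URI
local res, err = httpc:request{
  method = "GET",
  path = "http://example.com/some/endpoint",
  headers = { ["Host"] = "example.com" },
}

httpc:set_keepalive()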

Header assembly performance issue

local h = ""
for i, v in pairs(nreqt.headers) do
-- fix cookie is a table value
if type(v) == "table" and i == "cookie" then
v = table.concat(v, "; ")
end
h = i .. ": " .. v .. "\r\n" .. h
end

With h = i .. ": " .. v .. "\r\n" .. h, will there be a performance problem when there are many headers? Why not consider using table.concat?
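A sketch of what the table.concat variant suggested above might look like: collect the header lines in a table and join them once at the end (an illustration of the suggestion, not the library's actual code):

local parts = {}
for name, value in pairs(nreqt.headers) do
    -- cookies may arrive as a table of values
    if type(value) == "table" and name == "cookie" then
        value = table.concat(value, "; ")
    end
    parts[#parts + 1] = name .. ": " .. value
end
-- join once, then terminate the header block
local h = table.concat(parts, "\r\n") .. "\r\n\r\n"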

request failed and returns read status line failed

When I try to use
local ok, code, headers, status, body = httpClient:request {
    url = url,
    timeout = 3000,
    scheme = 'http',
    method = "POST",
    headers = { ["Content-Type"] = "application/x-www-form-urlencoded" },
    body = postBody,
}
one request returned the result:
ok = nil
code = 'read status line failed read status line failed timeout'
As the error describes, I presume it happened because the server didn't respond within the 3 seconds I set in the code. But I'm not sure, and I can't check how the server handled my request since it belongs to one of our clients. According to the client, they did respond to the request. So I've come here to ask for help. Is there any documentation about this error? Or could anyone help me find the reason? Thanks!
