clementfarabet / async
An async framework for Lua/Torch.
This way both luv and lhttp_parser would have OS X support in their Makefiles.
Thanks
The HTTP server never seems to trigger the onBody callback, even when given a valid incoming request.
For example:
POST HTTP/1.1
Host: localhost:8080
Cache-Control: no-cache
{"title":"Hello World!","body":"This is my first post!"}
The request.body field is just an empty table, and I see that onBody is not triggered.
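A possible explanation (my guess, not confirmed in the thread): http_parser only delivers a body it can delimit, so a request without a Content-Length (or chunked encoding) header never fires onBody. A sketch of a well-formed version of the request above; the path and the two added headers are illustrative:

```lua
-- Build the POST request with an explicit Content-Length so the parser
-- knows where the body is (the added path/headers are assumptions):
local body = '{"title":"Hello World!","body":"This is my first post!"}'
local request = table.concat({
   'POST / HTTP/1.1',
   'Host: localhost:8080',
   'Cache-Control: no-cache',
   'Content-Type: application/json',
   'Content-Length: ' .. #body,
   '',
   body,
}, '\r\n')
print(request)
```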
When you run "test-json-server.lua", if any rogue client connects to that server and sends random data, the server crashes instead of disconnecting that client and carrying on, as a good server should :)
Here's a test that demonstrates it. No idea where it comes from yet.
There are some multipart/form-data requests that async does not parse properly.
e.g.
According to the code, async searches for the "Content-Disposition" header in a case-sensitive way, but per the RFC it is case-insensitive.
Also, according to this line, async expects the "Content-Type" line to come after the "Content-Disposition" line, but the order is not strict, so it can be reversed.
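Since multipart header field names are case-insensitive, the comparison should normalize case before matching. A minimal sketch of such a match; find_header is a hypothetical helper, not part of the async API:

```lua
-- Split a "Name: value" header line and compare the name
-- case-insensitively against the wanted field name.
local function find_header(line, wanted)
   local name, value = line:match('^([^:]+):%s*(.*)$')
   if name and name:lower() == wanted:lower() then
      return value
   end
   return nil
end

-- Both spellings should match:
print(find_header('content-disposition: form-data; name="file"', 'Content-Disposition'))
```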
Relevant to async. Memory still leaks.
I am just calling uv.read_stop explicitly (in handle.lua) once reading has started, before shutting down.
It's hacky, but it fixes the leak. If you want, I can send a pull request.
I have the following code for a TCP server (modified from the example) in a file named tcp-server.lua.
async = require 'async'

async.tcp.listen({host='localhost', port=4321}, function(client)
   -- Receive:
   client.ondata(function(chunk)
      -- Data:
      print('received: ' .. chunk)
      -- Reply:
      client.write('1')
   end)
   -- Done:
   client.onend(function()
      print('client gone...')
   end)
end)

async.go()
I have the following function that retrieves the message sent by the server.
async = require 'async'

function get_msg()
   local msg = nil
   async.tcp.connect({host='localhost', port=4321}, function(client)
      -- Write something
      client.write('something .. ')
      -- Callbacks
      client.ondata(function(chunk)
         print('received: ' .. chunk)
         msg = chunk
         client.close()
      end)
      -- Done:
      client.onend(function()
         print('connection closed...')
      end)
      async.setTimeout(10, function()
         if not silent then
            print('timed out')
         end
         client.close()
      end)
   end)
   async.go()
   return msg
end
I ran th tcp-server.lua in one terminal and then called get_msg() in a th shell (in another process). The result is:
luajit: src/unix/stream.c:1074: uv_shutdown: Assertion `((stream)->io_watcher.fd) >= 0' failed.
Aborted (core dumped)
If I comment out the part starting with async.setTimeout, it returns the value as intended. It also works if I comment out the client.close() in ondata.
The intention is to return the value as soon as possible, but to return nil on timeout if the server takes too long. I would think this is a fairly common scenario? Is this not the right way to do it?
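The assertion failure looks like a double close: ondata closes the handle, then the 10 ms timer fires and calls client.close() again on an already-shut-down stream, which trips uv_shutdown's fd assertion. One way around it (my own sketch, not part of the async API) is a close-once wrapper, so whichever fires first wins and the later call is a no-op:

```lua
-- Wrap a close function so it can only run once; subsequent calls
-- are silently ignored instead of shutting down a dead handle.
local function make_close_once(close_fn)
   local closed = false
   return function()
      if not closed then
         closed = true
         close_fn()
      end
   end
end
```

Inside the connect callback one would build `local close = make_close_once(client.close)` and call close() from both the ondata handler and the timeout.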
The error comes up when make is run, and it originates from darwin-proctitle.c. The most useful part of the stack trace seems to be:
/System/Library/Frameworks/ApplicationServices.framework/Frameworks/HIServices.framework/Headers/Processes.h:415:1:
note: 'GetCurrentProcess' has been explicitly marked deprecated here MacGetCurrentProcess(ProcessSerialNumber * PSN)
AVAILABLE_MAC_OS_X_VERSION_10_0_AND_LATER_BUT_DEPRECATED_IN_MAC_OS_X_VERSION_10_9;
The current automatic compile settings use "-Werror", which breaks the build when the compiler flags a FALLTHROUGH. There are three instances of this; could someone please change them in master?
async/lhttp_parser/http-parser/http_parser.c
Line 2095 in 293348b
Depends on uv_getaddrinfo.
The bug stems from the two makefiles (http_parser's and luv's), which have the paths to the luajit headers hard-coded.
Is it possible to use async for ipv6?
I tried something like this:
async = require 'async'

async.tcp.listen({host='::', port=8088}, function(client)
   client.ondata(function(chunk)
      print('received: ' .. chunk)
      client.write('thanks!')
   end)
   client.onend(function()
      print('client gone...')
   end)
end)

async.go()
strace shows that bind(10, {sa_family=AF_INET, sin_port=htons(8088), sin_addr=inet_addr("255.255.255.255")}, 16) = 0
And 255.255.255.255 seems wrong...
I tried luv:
local server = uv.new_tcp()
uv.tcp_bind(server, host, port)
server:listen(128, function(err)
   assert(not err, err)
   local client = uv.new_tcp()
   server:accept(client)
   on_connection(client)
end)
And it supports ipv6.
But if I reinstall async, this luv example stops working with:
attempt to call method 'listen' (a nil value)
stack traceback:
test_luv.lua:8: in function 'create_server'
test_luv.lua:22: in main chunk
[C]: in function 'dofile'
...arev/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:145: in main chunk
[C]: at 0x00406680
Our system is running Ubuntu 14.04. We've installed torch7. When we execute:
t7> network = torch.load('cifar.net')
(cifar.net is a neural network file)
we get this error:
th>network= torch.load('cifar.net')
/home/mijail/torch/install/share/lua/5.1/torch/File.lua:294: unknown object
stack traceback:
[C]: in function 'error'
/home/mijail/torch/install/share/lua/5.1/torch/File.lua:294: in function </home/mijail/torch/install/share/lua/5.1/torch/File.lua:190>
[C]: in function 'read'
/home/mijail/torch/install/share/lua/5.1/torch/File.lua:270: in function 'readObject'
/home/mijail/torch/install/share/lua/5.1/torch/File.lua:288: in function 'readObject'
/home/mijail/torch/install/share/lua/5.1/torch/File.lua:272: in function 'readObject'
/home/mijail/torch/install/share/lua/5.1/torch/File.lua:311: in function 'load'
[string "network= torch.load('cifar.net')"]:1: in main chunk
[C]: in function 'xpcall'
/home/mijail/torch/install/share/lua/5.1/trepl/init.lua:648: in function 'repl'
...jail/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:185: in main chunk
[C]: at 0x00406670
When trying to build async on Ubuntu 17.10 the build fails at:
make[1]: Entering directory '/home/lintujuh/torch/async/lhttp_parser'
CPPFLAGS=-fPIC make -C http-parser http_parser.o
make[2]: Entering directory '/home/lintujuh/torch/async/lhttp_parser/http-parser'
cc -fPIC -I. -DHTTP_PARSER_STRICT=0 -Wall -Wextra -Werror -O3 -c http_parser.c
http_parser.c: In function ‘http_parser_parse_url’:
http_parser.c:2093:18: error: this statement may fall through [-Werror=implicit-fallthrough=]
found_at = 1;
~~~~~~~~~^~~
http_parser.c:2096:7: note: here
case s_req_server:
^~~~
cc1: all warnings being treated as errors
Makefile:35: recipe for target 'http_parser.o' failed
It looks like the dependency on penlight is missing from the rockspec.
Trying to install async through luarocks on OS X Yosemite, I get the following error:
luarocks install async
Installing https://raw.githubusercontent.com/torch/rocks/master/async-scm-1.rockspec...
Using https://raw.githubusercontent.com/torch/rocks/master/async-scm-1.rockspec... switching to 'build' mode
Cloning into 'async'...
remote: Counting objects: 203, done.
remote: Compressing objects: 100% (176/176), done.
remote: Total 203 (delta 20), reused 167 (delta 20)
Receiving objects: 100% (203/203), 395.29 KiB | 435.00 KiB/s, done.
Resolving deltas: 100% (20/20), done.
Checking connectivity... done.
make LUA_BINDIR=/usr/local/bin LUA_LIBDIR=/usr/local/lib LUA_INCDIR=/usr/local/include
/Applications/Xcode.app/Contents/Developer/usr/bin/make -C lhttp_parser LUA_BINDIR=/usr/local/bin LUA_LIBDIR=/usr/local/lib LUA_INCDIR=/usr/local/include
cc -c lhttp_parser.c -o lhttp_parser.o -Ihttp-parser -I /usr/local/include -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -Wall -Werror -fPIC
CPPFLAGS=-fPIC /Applications/Xcode.app/Contents/Developer/usr/bin/make -C http-parser http_parser.o
cc -fPIC -I. -DHTTP_PARSER_STRICT=0 -Wall -Wextra -Werror -O3 -c http_parser.c
cc -bundle -o lhttp_parser.so lhttp_parser.o http-parser/http_parser.o -lm -lpthread -lluajit -L /usr/local/lib
Undefined symbols for architecture x86_64:
"_luaL_setfuncs", referenced from:
_luaopen_lhttp_parser in lhttp_parser.o
"_lua_callk", referenced from:
_lhttp_parser_on_message_begin in lhttp_parser.o
_lhttp_parser_on_url in lhttp_parser.o
_lhttp_parser_on_header_field in lhttp_parser.o
_lhttp_parser_on_header_value in lhttp_parser.o
_lhttp_parser_on_headers_complete in lhttp_parser.o
_lhttp_parser_on_body in lhttp_parser.o
_lhttp_parser_on_message_complete in lhttp_parser.o
...
"_lua_getuservalue", referenced from:
_lhttp_parser_on_message_begin in lhttp_parser.o
_lhttp_parser_on_url in lhttp_parser.o
_lhttp_parser_on_header_field in lhttp_parser.o
_lhttp_parser_on_header_value in lhttp_parser.o
_lhttp_parser_on_headers_complete in lhttp_parser.o
_lhttp_parser_on_body in lhttp_parser.o
_lhttp_parser_on_message_complete in lhttp_parser.o
...
"_lua_setuservalue", referenced from:
_lhttp_parser_new in lhttp_parser.o
"_lua_tointegerx", referenced from:
_lhttp_parser_parse_url in lhttp_parser.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
make[1]: *** [lhttp_parser.so] Error 1
make: *** [lhttp_parser/lhttp_parser.so] Error 2
Error: Build error: Failed building.
Not urgent, but it would be nice to have the full http stack (server + client), using http_parser.
I use this repo to get torch7 on Windows. When I run luarocks install async to install async, it shows:
Installing https://raw.githubusercontent.com/torch/rocks/master/async-scm-1.rockspec
Cloning into 'async'...
remote: Counting objects: 210, done.
remote: Compressing objects: 100% (184/184), done.
remote: Total 210 (delta 19), reused 172 (delta 19), pack-reused 0
Receiving objects: 100% (210/210), 403.80 KiB | 95.00 KiB/s, done.
Resolving deltas: 100% (19/19), done.
nmake LUA_BINDIR=E:/torch/distro-win/./install/bin LUA_LIBDIR=E:/torch/distro-win/./install/lib LUA_INCDIR=E:/torch/distro-win/./install/include
Microsoft (R) Program Maintenance Utility Version 12.00.21005.1
Copyright (C) Microsoft Corporation. All rights reserved.
makefile(1) : fatal error U1000: syntax error : ")" missing in macro invocation
Stop.
Error: Failed installing dependency: https://raw.githubusercontent.com/torch/rocks/master/async-scm-1.rockspec - Build error: Failed building.
the problem is here:
build = {
   type = "command",
   build_command = "$(MAKE) LUA=$(LUA) LUA_BINDIR=$(LUA_BINDIR) LUA_LIBDIR=$(LUA_LIBDIR) LUA_INCDIR=$(LUA_INCDIR)",
On Windows there is no make, only nmake (or cmake), so luarocks install async cannot use the Makefile there. To build async from source on Windows, the rockspec needs a cmake-like build_command.
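One way to do this without touching the Unix build is LuaRocks' per-platform overrides. A hedged sketch of the rockspec change; the `platforms` table is standard LuaRocks, but the nmake command and Makefile.win below are assumptions, and a Windows-compatible makefile would still have to be written:

```lua
-- Rockspec build section with a Windows-specific override; on other
-- platforms the top-level build_command is used unchanged.
build = {
   type = "command",
   build_command = "$(MAKE) LUA=$(LUA) LUA_BINDIR=$(LUA_BINDIR) LUA_LIBDIR=$(LUA_LIBDIR) LUA_INCDIR=$(LUA_INCDIR)",
   platforms = {
      windows = {
         build_command = "nmake /f Makefile.win LUA_INCDIR=$(LUA_INCDIR)",
      },
   },
}
```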
How does one enable CORS?
I have a python TCP server adapted from the example in https://docs.python.org/2/library/socketserver.html.
class CommandTCPHandler(SocketServer.StreamRequestHandler):
    def handle(self):
        # self.rfile is a file-like object created by the handler;
        # we can now use e.g. readline() instead of raw recv() calls
        self.data = self.rfile.readline().strip()
        print "{} wrote:".format(self.client_address[0])
        print self.data
        # Likewise, self.wfile is a file-like object used to write back
        # to the client
        self.wfile.write('123')

if __name__ == "__main__":
    HOST, PORT = "localhost", 1234
    # Create the server, binding to localhost on port 1234
    server = SocketServer.TCPServer((HOST, PORT), CommandTCPHandler)
    # Activate the server; this will keep running until you
    # interrupt the program with Ctrl-C
    server.serve_forever()
I have a client written in lua with async
async = require 'async'

for i=1,10 do
   async.tcp.connect({host='127.0.0.1', port=1234}, function(client)
      -- Write something
      client.write('hello there .. ')
      -- Callbacks
      client.ondata(function(chunk)
         print('received: ' .. chunk)
      end)
      -- Done:
      client.onend(function()
         print('connection closed...')
      end)
      async.setTimeout(1000, function()
         print('timed out')
         client.close()
      end)
   end)
   async.go()
end

print('done')
I ran the Python server and then the Lua client. The client never receives anything and times out all 10 times, while the server got 'hello there' all 10 times.
$> th tcp-client.lua
timed out
timed out
timed out
timed out
timed out
timed out
timed out
timed out
timed out
timed out
done
Server log:
127.0.0.1 wrote:
hello there ..
127.0.0.1 wrote:
hello there ..
127.0.0.1 wrote:
hello there ..
127.0.0.1 wrote:
hello there ..
127.0.0.1 wrote:
hello there ..
127.0.0.1 wrote:
hello there ..
127.0.0.1 wrote:
hello there ..
127.0.0.1 wrote:
hello there ..
127.0.0.1 wrote:
hello there ..
127.0.0.1 wrote:
hello there ..
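A likely cause (an assumption on my part, not confirmed in the thread): the Python handler reads with rfile.readline(), which blocks until it sees a newline, and the Lua client never sends one. The server only logs the data once the client's timeout closes the socket, at which point the reply has nowhere to go. Terminating each message with '\n' should unblock readline(). A minimal sketch of the framing, independent of the async library; frame is a hypothetical helper, and in the client above one would send frame('hello there') instead:

```lua
-- Newline-terminate each message so a line-oriented peer
-- (e.g. Python's rfile.readline()) can return immediately.
local function frame(msg)
   return msg .. '\n'
end

print(frame('hello there'))
```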
Depends on uv_spawn.
Hi guys, do you have an example of how to set up a TCP server on node.js to work with the client in here?
The example json server code does not seem to work with the Python requests library as a client; it fails with a "Bad json request" error.
SERVER IN LUA:
local async = require 'async'
require 'nn'

async.json.listen({host='0.0.0.0', port=8082}, function(req, res)
   print('request:', req)
   res({
      msg = 'my answer:',
      attached = 'pretty pretty pretty goooood.'
   })
   collectgarbage()
   print(collectgarbage("count") * 1024)
end)

async.go()
CLIENT IN PYTHON:
import requests
import json

payload = {'msg':'Hey', 'attached':'How are you'}
# Note: the second positional argument of requests.get() is params (the
# query string), not the request body, so no JSON body is sent here.
requests.get('xx.xx.xx.xx:8082', json.dumps(payload))
I've noticed that an async server very slowly leaks memory. On my async server, it leaks about 200 bytes a minute.
This is not much at all, but if your server is running for a few days/months it'll eventually bloat up.
I ruled out cjson and any other modules in the server. I tried explicit garbage collection. They are not the issue.