Comments (32)
I'm using async.http
from async.
OK, it leaks in the http module, and the leaks are quite rapid. Earlier I only had moderate load on the server (1 request/second on average), but now that I'm load testing it, it's leaking memory quite rapidly. So the leak is definitely in the http request/response path. I'll try to post a minimal example.
from async.
Minimal example:
If you run async/tests/test-http-server.lua
and then run this gist
https://gist.github.com/soumith/8150139
You'll see a rapid increase in memory usage.
I also tried explicitly garbage collecting, that's not the issue.
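For later readers without the gist handy, the shape of that repro can be sketched as a self-contained script. This is Python, not the Lua of the thread, and the tiny stdlib server is a hypothetical stand-in for async/tests/test-http-server.lua; the port, body, and request count are all illustrative:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # reply with a tiny fixed body, like a trivial test server
        self.send_response(200)
        self.send_header("Content-Length", "2")
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # silence per-request logging

srv = HTTPServer(("127.0.0.1", 0), Handler)  # port 0: OS picks a free port
threading.Thread(target=srv.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/" % srv.server_port

# fire requests in a tight loop, as the gist does; while this runs,
# watch the server process's memory in top/htop for steady growth
for _ in range(100):
    body = urllib.request.urlopen(url).read()

srv.shutdown()
print(body)
```

With a leaking server, RSS climbs steadily while this loop runs and plateaus when it stops, which matches the behavior described below.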
from async.
I see that there's a test-memleaks.lua and there is a constant leakage when I run that as well.
Are the leaks a known issue? @ocallaco @clementfarabet ?
from async.
I think I saw that at some point... but not recently... except with the search engine... but that was a Java issue
Peter
from async.
@soumith , is the leak also occurring with plain tcp servers?
The test-memleaks.lua script was testing process leaks / how many processes can you spawn simultaneously. It was misnamed, I just patched it.
from async.
@clementfarabet No, it only happens with http servers. I suspected lhttp_parser, but I looked through the code and there's no real malloc/free happening there.
from async.
@clementfarabet actually, plain tcp servers are leaking memory. I just didn't test it well enough earlier.
Here's a gist to run as a client: https://gist.github.com/soumith/8163788
Run tests/test-tcp-server.lua to check the leaking behavior.
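The same pattern for plain TCP can be sketched as a self-contained Python script; the one-shot echo server here is a hypothetical stand-in for tests/test-tcp-server.lua (not its actual behavior), and the loop mirrors the client gist's "fresh connection per request" shape:

```python
import socket
import threading

def echo_loop(server_sock):
    # accept connections and echo a single message each,
    # standing in for tests/test-tcp-server.lua
    while True:
        try:
            conn, _ = server_sock.accept()
        except OSError:
            return  # listening socket was closed
        data = conn.recv(1024)
        conn.sendall(data)
        conn.close()

srv = socket.socket()
srv.bind(("127.0.0.1", 0))  # port 0: OS picks a free port
srv.listen(16)
threading.Thread(target=echo_loop, args=(srv,), daemon=True).start()
port = srv.getsockname()[1]

# one fresh connection per iteration, as the client gist does;
# watch the server's memory in top/htop while this loops
for _ in range(100):
    c = socket.create_connection(("127.0.0.1", port))
    c.sendall(b"ping")
    reply = c.recv(1024)
    c.close()

srv.close()
print(reply)
```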
from async.
Which machine are you running that on? Is it your own or one of our servers?
Peter
from async.
It's one of my own: a standard Ubuntu 13.10, nothing out of the ordinary.
Are you guys not seeing the leaks on your side if you try to reproduce the gists?
from async.
Oh... OK then... if it was one of ours you'd have been banging up against the firewall... I didn't see it on mine, I'll try on another one.
Peter
from async.
Hmmm, that's weird. What OS are you guys running on the servers? I want to eliminate anything that's exclusive to my setup.
I'll try it on an OS X machine later today, and on another Linux box.
from async.
I'm running ubuntu 13 at my office here... the servers are ubuntu 12... haven't tried on them yet, and before you do, let me know so I can make some room for you... :-)
Peter
from async.
I don't think we're on the same page. I don't have access to your servers; I don't even work for you guys :)
from async.
With both the http gist and the tcp gist, I see about 1MB/s leaking as long as the client is sending requests. If the client stops sending requests, the memory usage stays constant.
from async.
Ahhh... that would explain it... I was being cagey :-) We haven't been introduced, but I assumed Clement was aware of you? May I ask who you are working for and what you are doing?
I'm the DB/Server/Search/Anything-weird guy :-), based in the UK.
Peter
from async.
Yes, Clement and I worked at the NYU lab, but I work for Museami now; he did mention you before. I've been building some backend infrastructure for one of my projects, and Clement mentioned that you guys are using this in production, so I started using it (I love node/libuv). I hadn't had any issues until I noticed that my server crashed multiple times from being out of memory; that's when I isolated the leaks to the async module and produced some gists.
from async.
Oh right... thanks. To be fair, Clement and Connell are the guys at that end. I just gave it a go here to see what the issue was/is. I'll try another couple of machines tomorrow (it's late in the UK) and see what I come up with.
Are you using torch to run that, or something else?
P
from async.
thanks a lot, I appreciate it. I am using torch to run it.
from async.
Finally... must get to bed... but what are you using to measure the leaks? Are you sure it isn't Lua/Torch hanging onto memory for reuse?
Peter
P.S. Speak tomorrow
from async.
I'm just checking the memory usage in top/htop.
Lua/Torch definitely isn't hanging onto the memory for reuse; I put explicit garbage-collection calls in there and checked, to make sure that's not the problem.
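The methodology here, force a full collection and compare what the collector reports against the process RSS in top/htop, can be illustrated with a Python analogue (the thread's actual checks use Lua's collectgarbage; this sketch only shows the reasoning, and the allocation sizes are illustrative):

```python
import gc
import tracemalloc

# force a full collection, then check whether the managed heap shrinks;
# analogous to calling collectgarbage("collect") in the Lua tests
tracemalloc.start()

blobs = [b"x" * 1024 for _ in range(1000)]  # allocate roughly 1 MB of objects
del blobs
gc.collect()  # explicit collection, like the collectgarbage() calls above
after, peak = tracemalloc.get_traced_memory()

# `after` drops well below `peak` once the collector reclaims the blobs;
# if the collector reports memory freed but the process RSS in top/htop
# keeps climbing anyway, the leak is in native code below the collector
print(after < peak)
tracemalloc.stop()
```

That last point is exactly the diagnostic being made in this thread: GC succeeds, yet RSS grows, so the leak sits in native code rather than in the Lua heap.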
from async.
"The test-memleaks.lua script was testing process leaks / how many processes can you spawn simultaneously. It was misnamed, I just patched it."
@clementfarabet but shouldn't the memory be reclaimed after the process exits? Is there something I'm missing? I see that it creates a process with the "ls" command. After the process finishes, I'd expect the memory to be reclaimed, but that's not happening either.
from async.
Ugh, sorry for the spam, my last comment was wrong; I deleted it.
As I said, I don't understand why, with async, when a process spawns and finishes, there is still some memory leaked.
When I run test-memleaks.lua, I put a break in there so it stops spawning processes after 1000 spawns, and collect garbage.
But my memory usage remains stagnant at 9.2MB.
It's the same issue with http/tcp servers: new client requests, when processed, leak memory that is not garbage collected.
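A rough Python analogue of that experiment (the `ls` command mirrors the Lua test, but the spawn count is illustrative and this is a sketch of the idea, not the test itself):

```python
import gc
import subprocess

# mirror of the test-memleaks.lua experiment: spawn a short-lived `ls`
# process repeatedly, then force a collection and watch whether the
# parent's memory settles (steady growth would mean a per-spawn leak)
for _ in range(50):
    result = subprocess.run(["ls"], stdout=subprocess.DEVNULL)

gc.collect()
# after the loop, the parent's RSS in top/htop should plateau rather
# than retain memory for every spawn that already exited
print(result.returncode)
```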
from async.
One thing... purely looking at the code, and knowing what I have seen in the past... (read: untested :-) ), TCP can sometimes take a while to close a connection fully. With that Lua script, you are reassigning to the same connection variable each loop. Is it possible the previous connection has not completed its teardown before getting reused?
I know it seems dumb, but you could try putting the connections into an array/table instead of reusing the variable, and maybe increase the sleep time to something dumb like 0.5, just to prove/disprove this point.
I would be looking today... but I am recovering a server that went AWOL last night... oops
P
from async.
That's definitely not it; the leaks happen over time under a real-world load, with plenty of time for teardown.
from async.
I tested it on a fresh OS X box and on another Linux box.
I'm not even sure where the leak arises from. I pretty much give up.
I don't know why you guys haven't noticed/reproduced it on your servers; it definitely happens at my end.
from async.
Hey @soumith , sorry I was taking some time off these past few days...
I'm definitely seeing the problem on our servers. With TCP servers only, it's extremely slow to leak though, so it hasn't been a problem (we crash before, for other reasons, and we auto-restart on die). But as soon as I get a chance I'll be looking into it: I think it's got to be at the luv/Lua interface level, probably really hard to catch...
C.
from async.
You deserve your holiday, you work too hard.
Well, I'm glad that you could reproduce it; I thought it was only on my machines.
I've been looking into it as well.
Talk to you when you're back. I'm doing the same thing, restarting after a while, because I keep persistent state in redis.
from async.
Finally fixed. Clement's hints on the luv bug tracker helped. thanks.
from async.
Wait, is it really fixed? Removing the .close() works, but it's a bit aggressive, I still think there's a bug in Luv. Tim is looking into it.
from async.
I fixed it in luv; I sent a pull request to Tim as well. You don't need to remove the .close().
from async.
Just saw it, that's awesome!
from async.