drakma's People

Contributors

aeronotix, avodonosov, cage2, criesbeck, g000001, galdor, gefjon, gitter-badger, hanshuebner, ieure, kaiserprogrammer, kisp, llibra, lokedhs, mdbergmann, orivej, phmarek, ruricolist, rwiker, shamazmazum, slyrus, stassats, tmccombs, turtle-bazon, vityok, vseloved, wukix, xh4, zellerin


drakma's Issues

Drakma has a non-obvious issue with the url http://rwis.mdt.mt.gov/scanweb/Camera.asp?Pageid=Camera&Units=English&Groupid=301000&Siteid=301001&Senid=&Wxid=3011&Mapid=&DisplayClass=Java&SenType=All&HEndDate=&Zoneid=&Mode=&Sprayerid=&Dvid=&CD=9%2F12%2F2013+7%3A46%3A34+PM

drakma seems to have an issue with the url

http://rwis.mdt.mt.gov/scanweb/Camera.asp?Pageid=Camera&Units=English&Groupid=301000&Siteid=301001&Senid=&Wxid=3011&Mapid=&DisplayClass=Java&SenType=All&HEndDate=&Zoneid=&Mode=&Sprayerid=&Dvid=&CD=9%2F12%2F2013+7%3A46%3A34+PM
dramka-issue.txt

as exemplified by the attached traceback.

curl pulls in the page nicely.

Warnings in LispWorks

Could we add the following in HTTP-REQUEST?

(declare (ignorable certificate key certificate-password verify max-depth ca-file ca-directory))

Also, maybe the docs should mention that these are not available on LispWorks.

Thanks,
Edi.

drakma and cl-oauth..

Hi,
I'm using cl-oauth to authenticate with twitter (it's in the cl-twitter package, also on github).

I received a complaint from a quicklisp user that he couldn't authenticate with twitter.

After looking into this a bit I realized that drakma is encoding the uri on get requests. Now, cl-oauth passes in an encoded uri so this uri gets encoded twice. This was never a problem for me because I was using v 1.2.3 of drakma where this didn't happen.
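The double-encoding effect is easy to demonstrate outside Lisp; here is a Python sketch (Python purely for illustration, the behavior is language-agnostic):

```python
from urllib.parse import quote

raw = "a b&c"
once = quote(raw, safe="")    # 'a%20b%26c'
twice = quote(once, safe="")  # '%' itself gets re-encoded as '%25'
```

An already-encoded URI passed through a second encoder turns every "%" into "%25", which is exactly what breaks the OAuth signature.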

Here's the code diff (plus my proposed change :)
========================== v 1.2.3 ==================================
(when (and (not parameters-used-p)
           parameters)
  (setf (uri-query uri)
        ;; append parameters to existing query of URI
        (format nil "~@[~A~]~:*~:[~;&~]~A"
                (uri-query uri)
                (alist-to-url-encoded-string parameters external-format-out))))

===========================current github/v1.2.4=============
(when-let (all-get-parameters
           (append (dissect-query (uri-query uri))
                   (and (not parameters-used-p) parameters)))
  (setf (uri-query uri)
        (alist-to-url-encoded-string all-get-parameters external-format-out)))
===============================my proposed change===================================================
(when (and (not parameters-used-p)
           parameters)
  (when-let (all-get-parameters
             (append (dissect-query (uri-query uri))
                     (and (not parameters-used-p) parameters)))
    (setf (uri-query uri)
          (alist-to-url-encoded-string all-get-parameters external-format-out))))


A few more comments:

This is the handoff in cl-oauth:

(let* ((param-string-encoded (alist->query-string parameters
                                                  :include-leading-ampersand nil
                                                  :url-encode t)))
  (case request-method
    (:get
     (apply #'drakma:http-request
            (uri-with-additional-query-part uri param-string-encoded)
            :method request-method
            drakma-args))
    (:post
     (apply #'drakma:http-request
            uri
            :method request-method
            :content param-string-encoded
            drakma-args))))
As you can see, url-encode is set to t. That was (I think!) because previous versions required encoding and drakma wasn't providing any. Now, ideally this flag should be nil. However, the issue then becomes the string splitting in dissect-query: it splits on "=", which is also the terminating character of the authentication string...
=======================session===================
(drakma::split-string "oauth_signature=oq37d1/qm[....]fIKb778=&include_entities=T&oauth_consumer_key=9[....]cYBg&oauth_token=206[...]Tt5SwRvCJqQWgR3ajEQpk&oauth_signature_method=HMAC-SHA1&oauth_timestamp=1325002586&oauth_nonce=3680613621135035286&oauth_version=1.0" "&")
("oauth_signature=oq37d1/qmFX0YuQUwxsgfIKb778=" "include_entities=T"
[....])

CL-USER> (drakma::split-string "oauth_signature=oq37d1/qmFX0YuQUwxsgfIKb778=" "=")
; compiling (DEFUN HTTP-REQUEST ...)
STYLE-WARNING: redefining DRAKMA:HTTP-REQUEST in DEFUN

("oauth_signature" "oq37d1/qm[...]778")

(I've elided some of the strings...).
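The splitting pitfall above (a value that itself contains "=") is avoided by splitting each pair on the first "=" only. A minimal Python sketch of the idea (the function name is mine, not drakma's):

```python
def dissect_pair(pair):
    # split on the FIRST '=' only, so base64-ish values ending in '=' survive
    name, _, value = pair.partition("=")
    return name, value
```

With this rule, dissect_pair("oauth_signature=abc/def=") keeps the trailing "=" in the value instead of truncating it.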

I think my proposal resolves the issue. However, it would require a bit more work to get to what I think is the 'right' solution...

At this stage I'd like to get some feedback on what you consider the right course of action before proceeding to submit a patch.

Problem with escaping urls with #\? in query parameters

Drakma fails in puri:parse-uri when the query string contains #\? (and probably other characters):

(drakma:http-request "https://www.google.com/?q=test?")

Parse error:URI "https://www.google.com/?q=test?" contains illegal character #\? at position 30.
   [Condition of type PURI:URI-PARSE-ERROR]

Backtrace:
  0: (PURI::.PARSE-ERROR #<unavailable argument> #<unavailable &REST argument>)
  1: ((FLET PURI::READ-TOKEN :IN PURI::PARSE-URI-STRING) #<unavailable argument> #<unavailable argument>)
  2: (PURI::PARSE-URI-STRING #<unavailable argument>)
  3: (SB-DEBUG::TRACE-CALL #<SB-DEBUG::TRACE-INFO PURI::PARSE-URI-STRING> #<FUNCTION PURI::PARSE-URI-STRING> "https://www.google.com/?q=test?")
  4: (PURI:PARSE-URI #<unavailable argument> :CLASS #<unavailable argument>)
  5: (DRAKMA:HTTP-REQUEST "https://www.google.com/?q=test?")
  6: (SB-INT:SIMPLE-EVAL-IN-LEXENV (DRAKMA:HTTP-REQUEST "https://www.google.com/?q=test?") #<NULL-LEXENV>)
  7: (EVAL (DRAKMA:HTTP-REQUEST "https://www.google.com/?q=test?"))

The problem is that query-string encoding happens after parsing; the encoder escapes #\? as it should:

(drakma:url-encode "test?" :utf8)
"test%3F"

Versions used:
drakma-1.3.15
puri-20101006-git
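The underlying rule is that reserved characters must be percent-encoded before the URI is parsed, since the parser cannot know that a raw "?" in the query is data. A Python sketch of the working order of operations (illustrative only):

```python
from urllib.parse import quote, urlsplit

# encode the query VALUE first, then build the URI from already-safe pieces
url = "https://www.google.com/?q=" + quote("test?", safe="")
query = urlsplit(url).query
```

Parsing the assembled URI now succeeds because the second "?" arrives as "%3F".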

Cannot get rid of Connection: close

Even if I add :protocol :HTTP/1.1 :keep-alive t :close nil to my `drakma:http-request' call, I still get Connection: close added to the request. This makes me sad because I'm hitting an endpoint which will not work this way. Is there some other trick to keep Connection: close from being written?

httpOnly & secure not parsed correctly in combination

When parsing this Set-Cookie line:

Set-Cookie: shssl=4058628; path=/; secure; HttpOnly

The resulting Cookie is:

"COOKIE shssl=4058628; path=/; domain=www.domain.de"

Which misses both features: 'HttpOnly' and 'secure'.

I traced the bug down to 'parse-set-cookie', which returns
(("Set-Cookie: shssl" "4058628" (("path" . "/") ("secure; HttpOnly"))))

instead of (("Set-Cookie: shssl" "4058628" (("path" . "/") ("secure")
("HttpOnly"))))

As far as I understood the code, the problem is caused by 'read-name-value-pairs' of chunga.

I am using chunga 1.1.1 and drakma 1.2.9
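For reference, a correct Set-Cookie attribute split has to allow valueless attributes such as secure and HttpOnly. A hypothetical Python sketch of that parse:

```python
def parse_set_cookie(header):
    # split attributes on ';'; attributes without '=' (secure, HttpOnly)
    # become (name, None) pairs instead of being glued to their neighbor
    parts = [p.strip() for p in header.split(";")]
    name, _, value = parts[0].partition("=")
    attrs = []
    for part in parts[1:]:
        k, sep, v = part.partition("=")
        attrs.append((k, v) if sep else (k, None))
    return name, value, attrs
```

Applied to the header above, both secure and HttpOnly come back as separate attributes.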

http-request url-encoder is not called

It's not clear from the documentation in which cases the :url-encoder parameter is called. But such a call:

(drakma:http-request "http://ya.ru" :url-encoder 1)

won't produce any error, because url-encoder is not called at all. Is this a feature?

Problem loading drakma with quicklisp

Hi!

Thanks for your work on drakma!

I tried to load drakma with the latest quicklisp into the latest SBCL. Unfortunately, there is a problem with CL+SSL.

Cheers,
Clemens

$ sbcl
This is SBCL 1.2.5, an implementation of ANSI Common Lisp.
More information about SBCL is available at <http://www.sbcl.org/>.

SBCL is free software, provided as is, with absolutely no warranty.
It is mostly in the public domain; some portions are provided under
BSD-style licenses.  See the CREDITS and COPYING files in the
distribution for more information.

* (ql:quickload :drakma)
To load "drakma":
  Load 1 ASDF system:
    drakma
; Loading "drakma"

debugger invoked on a SB-KERNEL:NAMESTRING-PARSE-ERROR in thread
#<THREAD "main thread" RUNNING {1003ED6613}>:
  parse error in namestring: logical namestring character which is not alphanumeric or hyphen:
  #\+
  CL+SSL
    ^

Type HELP for debugger help, or (SB-EXT:EXIT) to exit from SBCL.

restarts (invokable by number or by possibly-abbreviated name):
  0: [RETRY                        ] Retry ASDF operation.
  1: [CLEAR-CONFIGURATION-AND-RETRY] Retry ASDF operation after resetting the
                                     configuration.
  2: [ABORT                        ] Give up on "drakma"
  3:                                 Exit debugger, returning to top level.

(SB-IMPL::LOGICAL-WORD-OR-LOSE "cl+ssl")
0] 

Cookie parsing fails for admittedly bad cookies

I am writing against the Infoblox REST API. The moment I try to use a cookie-jar I get the following error:

end of file on #<FLEXI-STREAMS::LIST-INPUT-STREAM {1004BF43C3}>
[Condition of type END-OF-FILE]
Backtrace:
0: (READ-BYTE #<FLEXI-STREAMS::LIST-INPUT-STREAM {1004BF43C3}> T NIL)
1: (CHUNGA:READ-CHAR* #<FLEXI-STREAMS::LIST-INPUT-STREAM {1004BF43C3}> T NIL)
2: (CHUNGA::READ-QUOTED-STRING #<FLEXI-STREAMS::LIST-INPUT-STREAM {1004BF43C3}>)
3: (CHUNGA::READ-COOKIE-VALUE #<FLEXI-STREAMS::LIST-INPUT-STREAM {1004BF43C3}> :SEPARATORS ";")
4: (CHUNGA:READ-NAME-VALUE-PAIR #<FLEXI-STREAMS::LIST-INPUT-STREAM {1004BF43C3}> :VALUE-REQUIRED-P T :COOKIE-SYNTAX T)
5: (DRAKMA::PARSE-SET-COOKIE "ibapauth="client=API,group=VCP-Meter,ctime=1538755942,timeout=7200,mtime=1538755942,ip=10.134.10.30,auth=LOCAL,user=openbook,Rw+KK2u56eG1OP13xpSkhnQgaQteIjrVJIs"; httponly..
6: (DRAKMA::GET-COOKIES ((:DATE . "Fri, 05 Oct 2018 16:12:22 GMT") (:CACHE-CONTROL . "no-cache, no-store") (:PRAGMA . "no-cache") (:CONTENT-TYPE . "application/json") (:SET-COOKIE . "ibapauth="client=AP..
7: ((LABELS DRAKMA::FINISH-REQUEST :IN DRAKMA:HTTP-REQUEST) NIL NIL)
8: (DRAKMA:HTTP-REQUEST #<PURI:URI https://cdsinfdnsgm.nnodns.com/wapi/v2.6/record:cname?_return_as_object=1> :COOKIE-JAR #<DRAKMA:COOKIE-JAR (with 0 cookies) {1004BE13F3}> :PROTOCOL :HTTP/1.1 :METHOD :G..
9: (DNS-ADMIN::DRAKMA-REQUEST "https://cdsinfdnsgm.nnodns.com/wapi/v2.6/record:cname" #<DRAKMA:COOKIE-JAR (with 0 cookies) {1004BE13F3}> :PROTOCOL :HTTP/1.1 :METHOD :GET :CONTENT-TYPE "application/x-www

It breaks because of this cookie:

set-cookie: ibapauth="client=API,group=VCP-Meter,ctime=1538755825,timeout=7200,mtime=1538755825,ip=10.134.10.30,auth=LOCAL,user=openbook,xbGCCBttvLHdRCahtYaAZVsixnZHaY57zMM"; httponly; Path=/; secure

Here's the output stream which shows where the breakage occurs. It looks like the comma is the point of failure, but the real issue is that the cookie parser doesn't know how to treat a quoted string as a single atomic value.
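A tokenizer that handles this has to treat the quoted string as one atom. A rough Python sketch of that behavior (my own naming, not chunga's API):

```python
import io

def read_cookie_value(stream):
    # a leading '"' starts a quoted string: consume up to the closing quote,
    # keeping commas and semicolons inside it intact
    ch = stream.read(1)
    if ch == '"':
        out = []
        while True:
            c = stream.read(1)
            if c == "":
                raise ValueError("unterminated quoted string")
            if c == '"':
                return "".join(out)
            out.append(c)
    # unquoted: read until the ';' separator or end of input
    out = [ch]
    while True:
        c = stream.read(1)
        if c in ("", ";"):
            return "".join(out)
        out.append(c)
```

With this rule the commas inside ibapauth="client=API,group=..." no longer terminate the value early.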

#<SB-IMPL::STRING-OUTPUT-STREAM {100831DC73}>

The object is a STRUCTURE-OBJECT of type SB-IMPL::STRING-OUTPUT-STREAM.
IN-BUFFER: NIL
CIN-BUFFER: NIL
IN-INDEX: 512
IN: #
BIN: #
N-BIN: #
OUT: #
BOUT: #
SOUT: #
MISC: #
INPUT-CHAR-POS: NIL
BUFFER: "client=API\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0"
PREV: NIL
NEXT: NIL
POINTER: 10
INDEX: 10
INDEX-CACHE: 0
ELEMENT-TYPE: CHARACTER

Cookies not supplied when following 302 Found

I am using Drakma to retrieve a resource; the request looks like this (with some obfuscation of personal details):

GET /MarkAsDownloaded.aspx?licenseid=01234567&enduserid=012345678&ts=20120714152835&mac=baadf00d HTTP/1.1\r\n
Request Method: GET
Request Version: HTTP/1.1
Host: bigpondmusic.com\r\n
User-Agent: Drakma/1.2.6 (SBCL 1.0.54; Linux; 3.2.0-23-generic; http://weitz.de/drakma/)\r\n
Cookie: BigpondMusic=SessionId=baadf00d; BigpondMusicRememberLogin=True; InitialLogIn=true\r\n

The server returns a 302 Found:

HTTP/1.1 302 Found\r\n
Request Version: HTTP/1.1
Status Code: 302
Response Phrase: Found
Location: http://admin.dams.bigpond.com/content.select?pid=baadf00d&UserName=bob%40example.com&titleprefix=37_-_Greg_Champion__Colin_Buchanan_-_&Key=baadf00d%3d\r\n

Drakma then requests the new URI:

GET /content.select?pid=baadf00d&UserName=bob%40example.com&titleprefix=37_-_Greg_Champion__Colin_Buchanan_-_&Key=baadf00d%3d HTTP/1.1\r\n
Request Method: GET
Request Version: HTTP/1.1
Host: admin.dams.bigpond.com\r\n
User-Agent: Drakma/1.2.6 (SBCL 1.0.54; Linux; 3.2.0-23-generic; http://weitz.de/drakma/)\r\n

... but as you can see, the cookie originally passed as part of the original request (courtesy of the :cookie-jar option on drakma:http-request) doesn't get passed as part of the second request. This means that I receive an authentication failure from the server instead of the expected content.

I've tested this manually using my browser (Chromium 18.0.1025.151 Built on Ubuntu 12.04) and that works as expected; the request to admin.dams.bigpond.com is made with the cookie, and the content is retrieved.

All captures made with Wireshark. I'm installing Drakma through QuickLisp, and I'm running SBCL 1.0.54 on Linux Mint 13 Maya (GNU/Linux 3.2.0-23-generic x86_64). Hardware is a Lenovo ThinkPad L520 if that matters.

option :nodelay t to socket-connect breaks drakma on CLISP, ECL, CMUCL, SCL

The :nodelay t option (together with the :deadline option) to socket-connect was introduced in commit 9db1546, which was intended to support timeouts.

But :nodelay is not related to read or write timeouts. It disables the Nagle algorithm - https://en.wikipedia.org/wiki/Nagle%27s_algorithm.

The :nodelay option is not supported on CLISP, ECL, CMUCL, or SCL. Any http-request on these lisps fails with the error "UNSUPPORTED: NODELAY in SOCKET-CONNECT is unsupported."

Where supported, this option reduces TCP efficiency - if we call write-byte three times, it sends three separate TCP packets instead of combining them into one packet.

Therefore the suggestion: to remove the :nodelay option.
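If the option is kept at all, it could at least degrade gracefully on platforms that lack it. A sketch of that idea in Python (the usocket-level fix would be analogous):

```python
import socket

def set_nodelay_if_supported(sock, enable):
    """Disable Nagle only when explicitly requested; on platforms where the
    option is unsupported, fall back silently instead of erroring out."""
    if not enable:
        return False
    try:
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
        return True
    except OSError:
        return False
```

Callers that never ask for :nodelay then work everywhere, and unsupported platforms simply keep Nagle enabled.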

http-request silently converts encoded `%26' back to `&'

With both 1.2.6 and (without the preserve-uri parameter) 1.2.5,
for each of the following two cases the puri:uri value returned is
"http://id.loc.gov/vocabulary/graphicMaterials/label/Actions%20&%20defenses"
Where the requested URL 404s, I would expect the puri:uri to be
"http://id.loc.gov/vocabulary/graphicMaterials/label/Actions%20%26%20defenses"

(drakma:http-request "http://id.loc.gov/vocabulary/graphicMaterials/label/Actions%20%26%20defenses"
:redirect t
:auto-referer t)

(drakma:http-request "http://id.loc.gov/vocabulary/graphicMaterials/label/Actions%20%26%20defenses"
:preserve-uri t
:redirect t
:auto-referer t)

By comparison the following does what i expect (namely eats the 302 and follows the redirect):

(drakma:http-request "http://id.loc.gov/vocabulary/graphicMaterials/label/Abandoned%20buildings"
:preserve-uri t
:redirect t
:auto-referer t)

By comparison, here are two curl invocations, each of which performs as expected:

curl -I http://id.loc.gov/vocabulary/graphicMaterials/label/Actions%20%26%20defenses

curl -I http://id.loc.gov/vocabulary/graphicMaterials/label/Abandoned%20buildings

Ending a continuation-style request results in a TCP input timeout; drakma side never sends FIN packet

Pseudocode example of the problem:

(let ((server-handle (http-request blah :content :continuation
                                   :method :post :keep-alive t :close nil ...)))
  ;; since it is a continuation, are keep-alive and close valid? ostensibly
  ;; you want to keep the connection alive, but do these parameters have
  ;; more meaning for :continuation?
  ...
  (funcall server-handle new-data t)  ; t indicates that we have more data to send
  ...
  ;; finally
  (funcall server-handle nil nil))    ; nil should indicate that we are done

Later there is an input timeout on the socket, and in Wireshark I can see that no FIN packet is sent; we just stop transmitting.

Drakma throws error on SSL url which is handled fine by browser

I'm having a hard time understanding this error which results on accessing a given SSL url:

(drakma:http-request "https://health.edcast.org/learn/first-responder-trauma-emergency-care-program-fall-2014")

A failure in the SSL library occurred on handle #.(SB-SYS:INT-SAP #X7FFFEC01E560) (return code: 1). SSL error queue:
error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure
[Condition of type CL+SSL::SSL-ERROR-SSL]

Can anyone tell me how to fix or work around this?

Problems with URL-ENCODE

The request:
(drakma:http-request "https://foobar.slack.com/api/search.all?query=users&query=\"password\"&token=foobar")

crashes PURI because of the " character. This seems valid.

However, the request
(drakma:http-request "https://foobar.slack.com/api/search.all?query=users&query=%22password%22&token=foobar")

Is being transformed (probably by drakma:url-encode) to:
https://foobar.slack.com/api/search.all?query=users&query=%2522password%2522&token=foobar

I've worked around this with url-encode modified to skip the % character:

  "Returns a URL-encoded version of the string STRING using the
external format EXTERNAL-FORMAT. Skip #\%."
  (with-output-to-string (out)
    (loop for octet across (drakma::string-to-octets (or string "")
                                                     :external-format external-format)
          for char = (code-char octet)
          do (cond ((or (char<= #\0 char #\9)
                        (char<= #\a char #\z)
                        (char<= #\A char #\Z)
                        (find char "$-_.!*'(),%" :test #'char=))
                    (write-char char out))
                   ((char= char #\Space)
                    (write-char #\+ out))
                   (t (format out "%~2,'0x" (char-code char)))))))

But right now I have such a mess here with multiple levels of url-sanitization that I've got a feeling that I am doing something completely wrong.

So, the question.

How to send " in URI?

using #\DOUBLE_LOW-9_QUOTATION_MARK in URL crashes Drakma

I encountered an error caused by non-Latin-1 characters in a given url on SBCL, e.g.:

(drakma:http-request "http://www.youtube.com/„weird-url")

debugger invoked on a FLEXI-STREAMS:EXTERNAL-FORMAT-ENCODING-ERROR       in thread
#<THREAD "initial thread" RUNNING {1002998D23}>:
#\DOUBLE_LOW-9_QUOTATION_MARK (code 8222) is not a LATIN-1 character.

Type HELP for debugger help, or (SB-EXT:QUIT) to exit from SBCL.

restarts (invokable by number or by possibly-abbreviated name):
  0: [ABORT] Exit debugger, returning to top level.

(FLEXI-STREAMS::SIGNAL-ENCODING-ERROR
 #<FLEXI-STREAMS::FLEXI-LATIN-1-FORMAT (:ISO-8859-1 :EOL-STYLE :LF)
   {1002F196E3}>
 "~S (code ~A) is not a LATIN-1 character."
 #\DOUBLE_LOW-9_QUOTATION_MARK
 8222)

I am using Drakma 1.3.0 // SBCL 1.0.55 // flexi-streams 1.0.7

As a quick fix I replaced all +latin-1+ in "request.lisp" with (make-external-format :utf-8), but I doubt that this is a valid solution, as +latin-1+ was chosen for a reason?
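For what it's worth, the usual cure is to percent-encode the UTF-8 bytes of non-Latin-1 characters before the request line is written; in Python the same character from this report encodes as:

```python
from urllib.parse import quote

# U+201E DOUBLE LOW-9 QUOTATION MARK is three bytes in UTF-8,
# so it percent-encodes to %E2%80%9E
path = quote("„weird-url")
```

The request line then contains only ASCII, so no Latin-1 encoding error can arise.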

Keep-Alive request hangs if made to a URL that returns a 204 response

Here's a simple HTTP server in Python that returns 204 response for all requests:

#!/usr/bin/env python
from BaseHTTPServer import BaseHTTPRequestHandler, HTTPServer

class S(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(204)
        self.end_headers()


def run(server_class=HTTPServer):
    server_address = ('', 4343)
    httpd = server_class(server_address, S)
    httpd.serve_forever()

if __name__ == "__main__":
    run()

When making a keep-alive request with drakma, the execution hangs forever. The non-keep-alive request works fine:

> (drakma:http-request
   "http://localhost:4343/"
   :method :GET)
#()
204
((:SERVER . "BaseHTTP/0.3 Python/2.7.15+")
 (:DATE . "Thu, 25 Oct 2018 15:44:38 GMT"))
#<PURI:URI http://localhost:4343/>
#<FLEXI-STREAMS:FLEXI-IO-STREAM {1051E270A3}>
T
"No Content"
;; OK


> (drakma:http-request 
   "http://localhost:4343/"
   :method :GET
   :keep-alive t :close nil)
;; hangs forever
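The hang is consistent with the client waiting for a body that will never arrive: per RFC 7230 §3.3.3, 1xx, 204 and 304 responses (and any response to HEAD) carry no message body. A sketch of that check in Python:

```python
def response_has_body(status, method="GET"):
    # RFC 7230 section 3.3.3: no body for responses to HEAD,
    # 1xx informational, 204 No Content, and 304 Not Modified
    if method.upper() == "HEAD":
        return False
    return not (100 <= status < 200 or status in (204, 304))
```

A keep-alive client should consult such a predicate before trying to read a body off the reused connection.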

Improve chunking behavior for small requests

Chunked encoding for request bodies is not supported by default by popular web frontends. DRAKMA should avoid chunked encoding if precomputing the complete request is possible.
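One way to do that is to buffer bodies up to a size limit and emit Content-Length, reserving chunked transfer for genuinely unbounded streams. A hypothetical sketch (names and the limit are mine):

```python
def frame_request_body(body, buffer_limit=65536):
    """Prefer Content-Length framing when the whole body fits in memory;
    fall back to chunked only beyond buffer_limit."""
    if isinstance(body, (bytes, bytearray)):
        return {"Content-Length": str(len(body))}, bytes(body)
    chunks, total = [], 0
    for chunk in body:
        chunks.append(chunk)
        total += len(chunk)
        if total > buffer_limit:
            # too big to buffer; the caller must stream the remaining chunks
            return {"Transfer-Encoding": "chunked"}, None
    data = b"".join(chunks)
    return {"Content-Length": str(len(data))}, data
```

Small POSTs then go out with an exact Content-Length, which every frontend accepts.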

SOCKS5 proxy support

Use case

Some systems, like the Tor network, expose their client functionality through a local SOCKS5 proxy.

Motivation

One of the cool functionalities of Tor is that it provides a mechanism to make servers publicly accessible through a random-looking address whose location is hidden. There is no need for a domain, and you can host the website on any machine running the Tor binary.

These servers, called Onion services, have two big advantages over normal web servers:

  • The identity of the user and the location of the server are hidden.
  • They are cheap to deploy and run, as there is no need to register for a domain.
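For reference, the client side of SOCKS5 (RFC 1928) is a small binary handshake. A Python sketch of the two messages a CONNECT needs, assuming the no-authentication method:

```python
def socks5_greeting():
    # version 5, offering exactly one auth method: 0x00 (no authentication)
    return bytes([0x05, 0x01, 0x00])

def socks5_connect_request(host, port):
    # CONNECT (0x01), reserved 0x00, address type 0x03 (domain name),
    # length-prefixed hostname, big-endian port
    name = host.encode("idna")
    return (bytes([0x05, 0x01, 0x00, 0x03, len(name)])
            + name + port.to_bytes(2, "big"))
```

Sending the hostname (address type 0x03) rather than a resolved IP also lets Tor do the DNS resolution, which matters for hidden services.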

Sending Files with Non-latin-1 Filenames Fails with Encoding Error Regardless of EXTERNAL-FORMAT-OUT

How to reproduce (the pathname does not have to exist on the system):

(drakma:http-request "http://example.com" :method :post :parameters '(("file" . #p"~/尻.txt")) :external-format-out :utf-8)

Expected behaviour: Sending the file with the filename encoded in utf-8
What happens instead:

Error:
#\U5C3B (code 23611) is not a LATIN-1 character.
   [Condition of type FLEXI-STREAMS:EXTERNAL-FORMAT-ENCODING-ERROR]

Restarts:
 0: [RETRY] Retry SLIME REPL evaluation request.
 1: [*ABORT] Return to SLIME's top level.
 2: [ABORT] Abort thread (#<THREAD "repl-thread" RUNNING {10053380B3}>)

Backtrace:
  0: (FLEXI-STREAMS::SIGNAL-ENCODING-ERROR #<FLEXI-STREAMS::FLEXI-LATIN-1-FORMAT (:ISO-8859-1 :EOL-STYLE :LF) {1003573FE3}> "~S (code ~A) is not a LATIN-1 character." #\U5C3B 23611)
  1: ((:METHOD FLEXI-STREAMS::WRITE-SEQUENCE* (FLEXI-STREAMS::FLEXI-LATIN-1-FORMAT T T T T)) #<unavailable argument> #<unavailable argument> #<unavailable argument> #<unavailable argument> #<unavailable ar..
  2: ((:METHOD TRIVIAL-GRAY-STREAMS:STREAM-WRITE-SEQUENCE (FLEXI-STREAMS:FLEXI-OUTPUT-STREAM T T T)) #<unavailable argument> #<unavailable argument> #<unavailable argument> #<unavailable argument>) [fast-m..
  3: ((SB-PCL::EMF TRIVIAL-GRAY-STREAMS:STREAM-WRITE-SEQUENCE) #<unused argument> #<unused argument> #<FLEXI-STREAMS:FLEXI-IO-STREAM {10087520B3}> "尻.txt" 0 5)
  4: (SB-IMPL::%WRITE-STRING "尻.txt" #<FLEXI-STREAMS:FLEXI-IO-STREAM {10087520B3}> 0 NIL)
  5: ((LABELS SB-IMPL::HANDLE-IT :IN SB-KERNEL:OUTPUT-OBJECT) #<FLEXI-STREAMS:FLEXI-IO-STREAM {10087520B3}>)
  6: (PRINC "尻.txt" #<FLEXI-STREAMS:FLEXI-IO-STREAM {10087520B3}>)
  7: ((LAMBDA (STREAM &OPTIONAL (#:FORMAT-ARG151 (ERROR (QUOTE SB-FORMAT:FORMAT-ERROR) :COMPLAINT "required argument missing" :CONTROL-STRING "; filename=\"~A\"" :OFFSET 13)) &REST SB-FORMAT::ARGS) :IN "/h..
  8: (FORMAT #<FLEXI-STREAMS:FLEXI-IO-STREAM {10087520B3}> #<FUNCTION (LAMBDA (STREAM &OPTIONAL (#:FORMAT-ARG151 #) &REST SB-FORMAT::ARGS) :IN "/home/linus/quicklisp/dists/quicklisp/software/drakma-1.3.10/..
  9: ((LAMBDA (STREAM) :IN DRAKMA::MAKE-FORM-DATA-FUNCTION) #<FLEXI-STREAMS:FLEXI-IO-STREAM {10087520B3}>)
 10: ((LAMBDA (STREAM) :IN DRAKMA::MAKE-FORM-DATA-FUNCTION) #<FLEXI-STREAMS:FLEXI-IO-STREAM {10087520B3}>) [external]
 11: ((LABELS DRAKMA::FINISH-REQUEST :IN DRAKMA:HTTP-REQUEST) #<CLOSURE (LAMBDA (STREAM) :IN DRAKMA::MAKE-FORM-DATA-FUNCTION) {100874CF1B}> NIL)
 12: (DRAKMA:HTTP-REQUEST #<PURI:URI http://example.com/> :METHOD :POST :PARAMETERS (("file" . #P"~/尻.txt")) :EXTERNAL-FORMAT-OUT :UTF-8)
 13: (SB-INT:SIMPLE-EVAL-IN-LEXENV (DRAKMA:HTTP-REQUEST "http://example.com" :METHOD :POST :PARAMETERS (QUOTE (#)) ...) #<NULL-LEXENV>)
 14: (EVAL (DRAKMA:HTTP-REQUEST "http://example.com" :METHOD :POST :PARAMETERS (QUOTE (#)) ...))
 --more--

It seems that the :external-format-out parameter is not taken into account for the filename header at all. Neither setting drakma:*drakma-default-external-format* nor an implementation-specific default encoding variable seems to have any effect.
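The standards-based way to carry a non-ASCII filename is RFC 6266 / RFC 5987: an ASCII fallback in filename= plus a UTF-8 percent-encoded filename*. A Python sketch of building such a header value (names are mine, not drakma's):

```python
from urllib.parse import quote

def content_disposition(field, filename):
    # ASCII fallback for legacy parsers, plus the RFC 5987 ext-value
    # with the filename's UTF-8 bytes percent-encoded
    fallback = filename.encode("ascii", "replace").decode("ascii")
    return ('form-data; name="%s"; filename="%s"; '
            "filename*=UTF-8''%s" % (field, fallback, quote(filename)))
```

The header stays pure ASCII on the wire, so no Latin-1 encoding error can occur regardless of the filename.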

Whitelist domains to not go via the proxy

Hi,

A feature request:

A special variable to hold a list of domains which should not go via the proxy. On some systems you want certain (non-local, usually) requests to go via the proxy and all others should stay local.

Would that be an agreeable addition?
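This is essentially the semantics of the conventional no_proxy list. A Python sketch of a plausible matching rule (the sample entries are made up):

```python
def bypass_proxy_p(host, no_proxy):
    # exact hostnames match directly; entries starting with '.' match the
    # domain itself and any of its subdomains
    for entry in no_proxy:
        if entry.startswith("."):
            if host == entry[1:] or host.endswith(entry):
                return True
        elif host == entry:
            return True
    return False
```

http-request would consult this before deciding whether to route the request through the configured proxy.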

incorrect POSIX return code handling on connect()

connect() can return error code -1 and set errno to EINTR, which doesn't mean the connection didn't succeed.

"If connect() is interrupted by a signal that is caught while blocked waiting to establish a connection, connect() shall fail and set errno to [EINTR], but the connection request shall not be aborted, and the connection shall be established asynchronously."
http://pubs.opengroup.org/onlinepubs/009695399/functions/connect.html

Currently, usocket and drakma fail when multiple threads try to connect to different HTTP endpoints and a signal arrives, making connect() return EINTR, which is far from desired. One way to work around this is to perform an asynchronous connect from the beginning, but the usocket library doesn't support that.
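A retry loop that follows the POSIX wording looks roughly like this (Python for illustration; InterruptedError is the EINTR case, and EISCONN signals that the interrupted connect already completed):

```python
import errno

def connect_retrying_eintr(sock, addr):
    """Sketch: retry connect() on EINTR; treat EISCONN as success, since
    the interrupted connect continues asynchronously per POSIX."""
    while True:
        try:
            sock.connect(addr)
            return
        except InterruptedError:
            continue  # EINTR: the connection attempt is still in progress
        except OSError as e:
            if e.errno == errno.EISCONN:
                return  # the earlier, interrupted connect() already succeeded
            raise
```

The EISCONN branch matters: retrying connect() on a socket whose first attempt completed in the background reports "already connected", which must be treated as success, not failure.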

:description

Would you please consider adding a :description option to your system definition of drakma?

puri is bringing drakma down

There is a latent bug in puri that is being brought to the fore by the
latest version of ASDF (3.3.0). Since drakma depends on puri, and
since the ASDF team has no interest in keeping ASDF compatible with
the bad behavior that puri exhibits, I thought you would be interested
in an early warning.

With the latest version of asdf (3.3.0) puri no longer compiles. The
following compiler error is thrown:

SET-DISPATCH-MACRO-CHARACTER would modify the standard readtable.
[Condition of type ASDF/FIND-SYSTEM:LOAD-SYSTEM-DEFINITION-ERROR]

In:

0: (SET-DISPATCH-MACRO-CHARACTER #\# #\u #'PURI::SHARP-U #<READTABLE {100002D6C3}>)

The code:

(defun sharp-u (stream chr arg)
  (declare (ignore chr arg))
  (let ((arg (read stream nil nil t)))
    (if *read-suppress*
        nil
        (if* (stringp arg)
           then (parse-uri arg)
           else (internal-reader-error
                 stream
                 "#u takes a string or list argument: ~s" arg)))))

(set-dispatch-macro-character #\# #\u #'puri::sharp-u)

Backing up to the next most recent release of asdf makes the problem
go away:

dev-lisp/asdf-3.2.1-r1:0/3.2.1-r1
dev-lisp/uiop-3.2.1:0

http-request connection-timeout parameter

Hi Edi,
I just wondered why the documentation states that the connection-timeout parameter of http-request is the time drakma waits for a server to respond initially (that's how I read it). According to the usocket documentation for socket-connect and its :timeout parameter over at https://common-lisp.net/project/usocket/api-docs.shtml, this sets SO_RCVTIMEO, which does have the effect of setting a timeout for connect(), but it also determines the timeout for read() (though not for write()).

I'll be testing this tomorrow, since I need a timeout on drakma. Do you concur with my above description? May I submit a pull request to explain this a little more in the documentation?

http-request url-encoder documentation appears wrong.

The current documentation:

URL-ENCODER specifies a custom URL encoder function which will be used
by drakma to URL-encode parameter names and values. It needs to be a
function of _one_ argument. The argument is the string to encode, the
return value must be the URL-encoded string. This can be used if
specific encoding rules are required.

But it appears that the function requires two arguments. The default is a function of two arguments:

request.lisp

(defun http-request (uri &rest args
                         &key 
                              ...
                              (url-encoder #'url-encode)
                              ...

utils.lisp

(defun url-encode (string external-format)
 ...)

the actual funcall in utils.lisp uses two arguments:

(defun alist-to-url-encoded-string (alist external-format url-encoder)
  ...
     (format out "~A~:[~;=~A~]"
             (funcall url-encoder name external-format)
             value
             (funcall url-encoder value external-format)))))
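Until the documentation and code agree, callers can shim a one-argument encoder into the two-argument calling convention the library actually uses. An illustrative Python analogue of such a shim:

```python
import inspect

def adapt_url_encoder(encoder):
    # wrap a one-argument encoder (as documented) so it accepts the
    # (string, external-format) call the code makes; pass two-argument
    # encoders through unchanged
    if len(inspect.signature(encoder).parameters) == 1:
        return lambda s, external_format: encoder(s)
    return encoder
```

Either signature then works at the call site, which is one way the library itself could stay backward compatible with the documented contract.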

Problem with unicode in URLs, e.g. http://pl.wiktionary.org/wiki/Wikis%c5%82ownik:Strona_g%c5%82%c3%b3wna

(let ((drakma:*header-stream* t)) (drakma:http-request "http://pl.wiktionary.org/wiki/Wikis%C5%82ownik:Strona_g%C5%82%C3%B3wna"))

The first printed line is:
GET /wiki/Wikis%25c5%2582ownik:Strona_g%25c5%2582%25c3%25b3wna HTTP/1.1

The path is escaped twice, which is incorrect. I am using drakma-1.3.8 from quicklisp and the newest puri from here: https://github.com/archimag/puri-unicode/blob/master/README (puri as packaged for quicklisp does not handle unicode correctly).

My fix for drakma is to change request.lisp, line 634, from rendering a newly created ad-hoc uri to manual rendering of the path and query:

(write-http-line "~A ~A ~A"
                 (string-upcase method)
                 (if (and preserve-uri
                          (stringp unparsed-uri))
                     (trivial-uri-path unparsed-uri)
                     (cond ((and proxy
                                 (null stream)
                                 (not proxying-https-p)
                                 (not real-host))
                            (puri:render-uri uri nil))
                           (t
                            (format nil "~a~a~a"
                                    (puri::encode-escaped-encoding
                                     (or (puri:uri-path uri) "/") nil t)
                                    (if (puri:uri-query uri) "?" "")
                                    (if (puri:uri-query uri)
                                        (puri::encode-escaped-encoding
                                         (puri:uri-query uri) nil t)
                                        "")))))
                 (string-upcase protocol))

Add hook to handle Content-Encoding

drakma should provide a hook that would make it easier to handle Content-Encoding's like gzip and deflate.

I would imagine such a hook would be a function that takes the headers, or content-encoding header, and the stream for the body, and returns a stream (either the same stream, or a decoded stream). This function would be called before setting the external format.

It would also be nice if drakma automatically handled a Content-Encoding of gzip or deflate, but that would add at least one additional dependency.

I will probably write a patch for this when I get a chance.
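The proposed hook could look like the following (a sketch of the idea in Python using stdlib codecs; the drakma version would wrap the flexi/chunga stream instead):

```python
import gzip
import io
import zlib

def decode_body(stream, content_encoding):
    # wrap or transform the raw body stream according to Content-Encoding,
    # passing unknown encodings through untouched
    if content_encoding == "gzip":
        return gzip.GzipFile(fileobj=stream)
    if content_encoding == "deflate":
        return io.BytesIO(zlib.decompress(stream.read()))
    return stream
```

Calling this before the external format is applied gives the caller decoded bytes, exactly as the issue suggests.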

Would you consider exporting url-encode?

The url-encode function in Drakma turns out to be very useful in other projects, as are a number of things in util.lisp.

Would you ever consider exporting url-encode from the Drakma package or alternatively creating a new project with url-encode / decode that is used by Drakma?

I suppose I'm starting to think that a number of things like url encoding, query string parsing etc are generically useful for apps that interact with REST APIs via Drakma.

I guess I'm respectfully asking for your thoughts and wisdom please?!?

Thanks! RB

Use native SSL implementation

It would be nice if there were no dependencies on CFFI. CL-TLS might be a good place to start for a lisp implementation of SSL. It looks close to being complete enough to use for https.

drakma throws low-level stream error when transfer is interrupted by host

Throwing an error from the underlying stream library is problematic in situations where you're trying to create a reliable service from possibly unreliable HTTP hosts, i.e. a reverse-proxy-ish scenario where you may want to retry or select a different backend. It also makes using drakma in a testing tool inconvenient.

Maybe something along the lines of the cookie error, where error generation can be optionally suppressed.

couldn't read from #<SB-SYS:FD-STREAM
                     for "socket 172.28.12.234:42164, peer: 172.26.27.23:8080"
                     {100A1492B3}>:
  Connection reset by peer
   [Condition of type SB-INT:SIMPLE-STREAM-ERROR]

Restarts:
 0: [RETRY] Retry SLIME REPL evaluation request.
 1: [*ABORT] Return to SLIME's top level.
 2: [ABORT] Abort thread (#<THREAD "new-repl-thread" RUNNING {10087C6063}>)

Backtrace:
  0: (SB-IMPL::SIMPLE-STREAM-PERROR "couldn't read from ~S" #<SB-SYS:FD-STREAM for "socket 172.28.12.234:42164, peer: 172.26.27.23:8080" {100A1492B3}> 104)
  1: (SB-IMPL::REFILL-INPUT-BUFFER #<SB-SYS:FD-STREAM for "socket 172.28.12.234:42164, peer: 172.26.27.23:8080" {100A1492B3}>)
  2: (SB-IMPL::INPUT-UNSIGNED-8BIT-BYTE #<SB-SYS:FD-STREAM for "socket 172.28.12.234:42164, peer: 172.26.27.23:8080" {100A1492B3}> NIL :EOF)
  3: (READ-BYTE #<SB-SYS:FD-STREAM for "socket 172.28.12.234:42164, peer: 172.26.27.23:8080" {100A1492B3}> NIL :EOF)
  4: (READ-BYTE #<CHUNGA:CHUNKED-IO-STREAM {100A1493E3}> NIL NIL)
  5: ((:METHOD FLEXI-STREAMS::READ-BYTE* (FLEXI-STREAMS:FLEXI-INPUT-STREAM)) #<unavailable argument>) [fast-method]
  6: ((:METHOD STREAM-READ-BYTE (FLEXI-STREAMS:FLEXI-INPUT-STREAM)) #<unavailable argument>) [fast-method]
  7: ((SB-PCL::EMF STREAM-READ-BYTE) #<unavailable argument> #<unavailable argument> #<FLEXI-STREAMS:FLEXI-IO-STREAM {100A14B523}>)
  8: (READ-BYTE #<FLEXI-STREAMS:FLEXI-IO-STREAM {100A14B523}> NIL NIL)
  9: (CHUNGA:READ-CHAR* #<FLEXI-STREAMS:FLEXI-IO-STREAM {100A14B523}> NIL NIL)
 10: (CHUNGA:READ-LINE* #<FLEXI-STREAMS:FLEXI-IO-STREAM {100A14B523}> #<READABLE-WRITABLE-STREAM {1004473D53}>)
 11: (DRAKMA::READ-STATUS-LINE #<FLEXI-STREAMS:FLEXI-IO-STREAM {100A14B523}> #<READABLE-WRITABLE-STREAM {1004473D53}>)
 12: ((LABELS DRAKMA::FINISH-REQUEST :IN DRAKMA:HTTP-REQUEST) NIL NIL)
 13: (DRAKMA:HTTP-REQUEST #<PURI:URI http://wac-public-m5-fe-2.internal.atlassian.com:8080/press> :ADDITIONAL-HEADERS (("Accept-Language" . "en")))
 14: (GET-MAGNOLIA-PUBLIC-BASE :SCHEME :HTTP :HTTP-HOST "wac-public-m5-fe-2.internal.atlassian.com" :PORT "8080" :LANG :EN :PATH "/press")
 15: (RUN-HTTP-PAGE-TEST ((:LANG . :EN) (:PATH . "/press") (:SCHEME . :HTTP) (:HTTP-HOST . "wac-public-m5-fe-2.internal.atlassian.com") (:PORT . "8080") (:TEST :LQUERY "title")))
 16: ((LAMBDA ()))
 17: (SB-INT:SIMPLE-EVAL-IN-LEXENV (LET ((RESULTS #)) (DOTIMES (I 1000) (PUSH # RESULTS)) RESULTS) #<NULL-LEXENV>)
 18: (SB-INT:SIMPLE-EVAL-IN-LEXENV (SB-INT:QUASIQUOTE (AND #S(SB-IMPL::COMMA :EXPR # :KIND 2))) #<NULL-LEXENV>)
 19: (SB-INT:SIMPLE-EVAL-IN-LEXENV (EVAL (SB-INT:QUASIQUOTE (AND #))) #<NULL-LEXENV>)
 --more--
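Until Drakma grows such an option, a workaround at the call site is to trap the low-level conditions yourself; a sketch (the wrapper name is mine):

```lisp
;; Workaround sketch: trap low-level stream/socket errors around the call.
(defun http-request-or-nil (uri &rest args)
  "Like DRAKMA:HTTP-REQUEST, but returns NIL instead of signalling
on connection resets and other stream-level failures."
  (handler-case
      (apply #'drakma:http-request uri args)
    ((or stream-error usocket:socket-error) (e)
      (warn "HTTP request to ~A failed: ~A" uri e)
      nil)))
```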

Additional-headers no longer parse correctly

I think this is probably drakma related?

It seems that content type is broken?

(drakma-async:http-request "https://www.fqdn.com" :additional-headers '((:mine "test")))
GET / HTTP/1.1                                                                                                                                                                            
Host: www.fqdn.com                                                                                                                                                                      
User-Agent: Drakma/1.3.9 (SBCL 1.1.18; Linux; 3.14.4-x86_64-linode40; http://weitz.de/drakma/)                                                                                            
Accept: */*                                                                                                                                                                               
MINE: (test)  

Should be

GET / HTTP/1.1                                                                                                                                                                            
Host: www.fqdn.com                                                                                                                                                                      
User-Agent: Drakma/1.3.9 (SBCL 1.1.18; Linux; 3.14.4-x86_64-linode40; http://weitz.de/drakma/)                                                                                            
Accept: */*                                                                                                                                                                               
MINE: test
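:additional-headers takes an alist, so each entry should be a dotted pair whose cdr is the header value. In the form above the cdr is the one-element proper list ("test"), and that list is what gets printed into the header. The dotted-pair version behaves as expected:

```lisp
;; '((:mine "test"))   makes the cdr the list ("test") -> "MINE: (test)"
;; '((:mine . "test")) makes the cdr the string "test" -> "MINE: test"
(drakma:http-request "https://www.fqdn.com"
                     :additional-headers '((:mine . "test")))
```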

Do not use last element of uri path in the default cookie path value

In get-cookies in cookies.lisp, the :path for cookie is set to

(or (parameter-value "path" parameters)
   (puri:uri-path uri)
    "/")

This differs from the RFC, https://tools.ietf.org/html/rfc6265#section-5.1.4, and in practice it breaks some applications (e.g., if a cookie is set at /app/login, its path becomes /app/login rather than /app, so the rest of the application never receives it).

Using (puri:uri-path (puri:merge-uris "." uri)) would help my use case, but I am not sure that this is in general exactly the same.

I can submit the one-line patch as a pull request if wanted (docs and test cases as well?)
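For comparison, the default-path algorithm from RFC 6265, section 5.1.4, is short enough to sketch directly (the function name is mine):

```lisp
;; Sketch of the "default-path" algorithm from RFC 6265, section 5.1.4.
(defun default-cookie-path (uri)
  "Default cookie path for URI per RFC 6265, section 5.1.4."
  (let ((path (or (puri:uri-path uri) "/")))
    (if (or (zerop (length path))
            (char/= (char path 0) #\/))
        "/"
        (let ((last-slash (position #\/ path :from-end t)))
          (if (zerop last-slash)
              "/"                             ; e.g. "/login"     -> "/"
              (subseq path 0 last-slash)))))) ; e.g. "/app/login" -> "/app"
```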

drakma:http-request does not terminate

I am using sbcl 1.1.14
drakma: 2.0.1
linux ubuntu 14.04 3.13.0-59-generic

(drakma:http-request "http://support.wnet.ua/lg.php"
:parameters (list (cons "query" "4")
(cons "arg" "8.8.8.8")
(cons "action" "Execute")
(cons "router" "hostkhar"))
:method :post)

will not terminate and a ctr-c + backtrace reveals:

0: ("bogus stack frame")
1: (SB-SYS:WAIT-UNTIL-FD-USABLE 7 :INPUT NIL NIL)
2: (SB-IMPL::REFILL-INPUT-BUFFER #<SB-SYS:FD-STREAM for "socket 192.168.0.13:39020, peer: 217.20.163.184:80" {1002C56803}>)
3: (SB-IMPL::INPUT-UNSIGNED-8BIT-BYTE #<SB-SYS:FD-STREAM for "socket 192.168.0.13:39020, peer: 217.20.163.184:80" {1002C56803}> T NIL)
4: (READ-BYTE #<SB-SYS:FD-STREAM for "socket 192.168.0.13:39020, peer: 217.20.163.184:80" {1002C56803}> T NIL)
5: (CHUNGA:READ-CHAR* #<SB-SYS:FD-STREAM for "socket 192.168.0.13:39020, peer: 217.20.163.184:80" {1002C56803}> T NIL)
6: ((:METHOD CHUNGA::FILL-BUFFER (CHUNGA:CHUNKED-INPUT-STREAM)) #<CHUNGA:CHUNKED-IO-STREAM {1002D826D3}>) [fast-method]
7: ((:METHOD TRIVIAL-GRAY-STREAMS:STREAM-READ-SEQUENCE (CHUNGA:CHUNKED-INPUT-STREAM T T T)) #<CHUNGA:CHUNKED-IO-STREAM {1002D826D3}> #(0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0...
AT SOME POINT DOWN HERE IS THE DRAKMA CALL

In all fairness, doing the same request via Chromium does not terminate either. Still, I think it is a bit weird that I can effectively freeze the program permanently with an http-request, as I'd expect it to either finish or time out.
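On SBCL (which the reporter is using), a hard timeout can be imposed around the whole request with sb-sys:with-deadline; Drakma's :connection-timeout, where supported, only covers connection establishment, not a stalled response body:

```lisp
;; SBCL-specific: give up if the whole request takes longer than 10 seconds.
(handler-case
    (sb-sys:with-deadline (:seconds 10)
      (drakma:http-request "http://support.wnet.ua/lg.php"
                           :method :post
                           :parameters '(("query" . "4")
                                         ("arg" . "8.8.8.8")
                                         ("action" . "Execute")
                                         ("router" . "hostkhar"))))
  (sb-sys:deadline-timeout ()
    (warn "request timed out")
    nil))
```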

DRAKMA always uses chunked encoding to send request bodies of unknown length

Content posted with Drakma to nginx-backed servers is being rejected due to the lack of a Content-Length header. Here's an example of a GET request to nginx's website (hosted, naturally enough, using nginx). Everything goes according to plan:

CL-USER> (rest (multiple-value-list (drakma:http-request "http://nginx.org")))
(200
 ((:SERVER . "nginx/1.3.6") (:DATE . "Wed, 17 Oct 2012 08:13:50 GMT")
  (:CONTENT-TYPE . "text/html; charset=utf-8") (:CONTENT-LENGTH . "8561")
  (:LAST-MODIFIED . "Thu, 04 Oct 2012 18:25:08 GMT") (:CONNECTION . "close")
  (:ETAG . "\"506dd484-2171\"") (:ACCEPT-RANGES . "bytes"))
 #<PURI:URI http://nginx.org/> #<FLEXI-STREAMS:FLEXI-IO-STREAM {1007203AD3}> T
 "OK")

If we try a POST, however, our world begins to crumble:

CL-USER> (rest (multiple-value-list (drakma:http-request "http://nginx.org" :method :post :parameters '(("foo" . "bar")))))
(411
 ((:SERVER . "nginx/1.3.6") (:DATE . "Wed, 17 Oct 2012 08:16:56 GMT")
  (:CONTENT-TYPE . "text/html; charset=utf-8") (:CONTENT-LENGTH . "180")
  (:CONNECTION . "close"))
 #<PURI:URI http://nginx.org/> #<FLEXI-STREAMS:FLEXI-IO-STREAM {1009CF3F03}> T
 "Length Required")

The headers sent in the previous request are as follows:

POST / HTTP/1.1
Host: nginx.org
User-Agent: Drakma/1.2.8 (SBCL 1.0.57.56-2273f3a; Darwin; 11.4.0; http://weitz.de/drakma/)
Accept: */*
Connection: close
Content-Type: application/x-www-form-urlencoded
Transfer-Encoding: chunked

Notably absent is a Content-Length header. If it's included, our "Length Required" error goes away. It can be included by setting the value of the content-length variable in drakma's http-request definition at the same time the content variable's value is encoded from a parameters list. To fix the error in the simplest case (when no files are being uploaded), one need only add a single line after line 496 in the current version:

          (t
           (setq content (alist-to-url-encoded-string parameters external-format-out)
                 content-type "application/x-www-form-urlencoded")
           ;; NEW LINE HERE -- the url-encoded string is pure ASCII,
           ;; so its character count equals its octet count:
           (setq content-length (length content)))

Actually fixing the problem for the more general case will be more involved, I expect.
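For what it's worth, later Drakma releases expose a content-length keyword on http-request; if I read the docs correctly, passing T buffers the body and sends a Content-Length header instead of using chunked transfer encoding:

```lisp
;; Assumes a Drakma version whose HTTP-REQUEST accepts :content-length.
(drakma:http-request "http://nginx.org"
                     :method :post
                     :parameters '(("foo" . "bar"))
                     :content-length t)
```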

http-request when URI is a puri:uri -- more silent coercion problems

In light of your recent fix I've now noticed that when the URI argument to http-request is a puri:uri we have a similar situation as when URI is a string, in that the following do not return equivalently:

(http-request #u"http://id.loc.gov/vocabulary/graphicMaterials/label/Action%20%26%20adventure%20dramas"
                     :preserve-uri t :method :head)

(drakma:http-request "http://id.loc.gov/vocabulary/graphicMaterials/label/Action%20%26%20adventure%20dramas"
                              :preserve-uri t :method :head)

It is beyond me whether this difference in return values constitutes a bug or not.

That said, I would like to point out that if this is considered a bug, then the possible fixes around puri may not be quite so trivial as they were for strings, especially because puri:parse-uri breaks percent-encoded non-ASCII characters by silently coercing them to goo.

This returns:

(drakma:http-request "http://id.loc.gov/vocabulary/graphicMaterials/label/A%20la%20poup%C3%A9e%20prints"
                     :preserve-uri t 
                     :method :head)

These don't:

(drakma:http-request #u"http://id.loc.gov/vocabulary/graphicMaterials/label/A%20la%20poup%C3%A9e%20prints"
                     :preserve-uri t 
                     :method :head)

(drakma:http-request
 (puri:parse-uri "http://id.loc.gov/vocabulary/graphicMaterials/label/A%20la%20poup%C3%A9e%20prints")
 :preserve-uri t 
 :method :head)

additional discussion here:
http://paste.lisp.org/+2R44

Also, maybe these are relevant:
https://github.com/archimag/puri-unicode
https://github.com/franzinc/uri
