* doc/misc/url.texi (Retrieving URLs): Update url-retrieve arguments.
Mention url-queue-retrieve.
* lisp/url/url-queue.el (url-queue-retrieve): Doc fix.
* etc/NEWS: Edits.
+2012-02-10 Glenn Morris <rgm@gnu.org>
+
+ * url.texi (Retrieving URLs): Update url-retrieve arguments.
+ Mention url-queue-retrieve.
+
2012-02-09 Glenn Morris <rgm@gnu.org>
* sem-user.texi (Semantic mode user commands): Typo fix.
info, or mailto URLs that need no further processing).
@end defun
-@defun url-retrieve url callback &optional cbargs
+@defun url-retrieve url callback &optional cbargs silent no-cookies
Retrieve @var{url} asynchronously and call @var{callback} with args
@var{cbargs} when finished. The callback is called when the object
has been completely retrieved, with the current buffer containing the
object and any MIME headers associated with it. @var{url} is either a
string or a parsed URL structure. Returns the buffer @var{url} will
load into, or @code{nil} if the process has already completed.
+If the optional argument @var{silent} is non-@code{nil}, suppress
+progress messages. If the optional argument @var{no-cookies} is
+non-@code{nil}, do not store or send cookies.
+@end defun
+
+@vindex url-queue-parallel-processes
+@vindex url-queue-timeout
+@defun url-queue-retrieve url callback &optional cbargs silent no-cookies
+This acts like the @code{url-retrieve} function, but downloads in
+parallel. The option @code{url-queue-parallel-processes} controls the
+number of concurrent processes, and the option @code{url-queue-timeout}
+sets a timeout in seconds.
@end defun
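As an illustrative sketch (not part of the manual text itself), a call using the new optional arguments might look like the following; the URL and the messages printed are examples only, and the callback's STATUS argument describes any redirects or errors encountered:

```elisp
;; Fetch a page asynchronously, silently, and without cookies.
;; When the transfer finishes, the callback runs in a buffer
;; containing the MIME headers followed by the response body.
(url-retrieve "https://www.gnu.org/"
              (lambda (status)
                (goto-char (point-min))
                (message "First header line: %s"
                         (buffer-substring (point)
                                           (line-end-position))))
              nil   ; CBARGS: extra args passed to the callback
              t     ; SILENT: suppress progress messages
              t)    ; NO-COOKIES: do not store or send cookies
```

`url-queue-retrieve' accepts the same arguments, so the call above can be queued for parallel download simply by substituting the function name.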
@node Supported URL Types
*** The option `ange-ftp-binary-file-name-regexp' has changed its
default value to "".
-** `url-queue-retrieve' downloads web pages asynchronously, but allow
-controlling the degree of parallelism.
++++
+** New function, url-queue-retrieve, fetches URLs asynchronously like
+url-retrieve does, but in parallel.
** VC and related modes
---
*** pc-mode.el is obsolete (CUA mode is much more comprehensive).
+[gnus.texi, message.texi need updating]
*** pgg is obsolete (use EasyPG instead)
---
+2012-02-10 Glenn Morris <rgm@gnu.org>
+
+ * url-queue.el (url-queue-retrieve): Doc fix.
+
2012-02-08 Lars Ingebrigtsen <larsi@gnus.org>
* url-parse.el (url): Add the `use-cookies' slot to the URL struct
;;;###autoload
(defun url-queue-retrieve (url callback &optional cbargs silent inhibit-cookies)
"Retrieve URL asynchronously and call CALLBACK with CBARGS when finished.
-Like `url-retrieve' (which see for details of the arguments), but
-controls the level of parallelism via the
-`url-queue-parallel-processes' variable."
+This is like `url-retrieve' (which see for details of the arguments),
+but downloads in parallel. The variable `url-queue-parallel-processes'
+sets the number of concurrent processes. The variable `url-queue-timeout'
+sets a timeout in seconds."
(setq url-queue
(append url-queue
(list (make-url-queue :url url
(push job jobs)))
(dolist (job jobs)
(setq url-queue (delq job url-queue)))))
-
+
(defun url-queue-start-retrieve (job)
(setf (url-queue-buffer job)
(ignore-errors