wget <command...>

A non-interactive network retriever

Arguments

Name        Description
command     The URL(s) to retrieve
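
In the simplest case each positional argument is one URL to fetch, and several may be given in a single run. A minimal sketch (example.com is a placeholder host, not from this reference):

```shell
# Each positional argument is a URL; several may be passed in one run.
# example.com is a placeholder used only for illustration; the command
# line is echoed rather than executed so nothing touches the network.
set -- https://example.com/a.txt https://example.com/b.txt
echo "wget $*"
```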

Options

Name                            Description
-V, --version                   Display the version of Wget and exit
-h, --help                      Print this help
-b, --background                Go to background after startup
-e, --execute=COMMAND           Execute a `.wgetrc'-style command
-o, --output-file=FILE          Log messages to FILE
-a, --append-output=FILE        Append messages to FILE
-q, --quiet                     Quiet (no output)
-v, --verbose                   Be verbose (this is the default)
-nv, --no-verbose               Turn off verboseness, without being quiet
--report-speed=TYPE             Output bandwidth as TYPE. TYPE can be bits
-i, --input-file=FILE           Download URLs found in local or external FILE
-F, --force-html                Treat input file as HTML
-B, --base=URL                  Resolves HTML input-file links (-i -F) relative to URL
--config=FILE                   Specify config file to use
--no-config                     Do not read any config file
--rejected-log=FILE             Log reasons for URL rejection to FILE
-t, --tries=NUMBER              Set number of retries to NUMBER (0 unlimits)
--retry-connrefused             Retry even if connection is refused
--retry-on-http-error=ERRORS    Comma-separated list of HTTP errors to retry
-O, --output-document=FILE      Write documents to FILE
-nc, --no-clobber               Skip downloads that would download to existing files (overwriting them)
--no-netrc                      Don't try to obtain credentials from .netrc
-c, --continue                  Resume getting a partially-downloaded file
--start-pos=OFFSET              Start downloading from zero-based position OFFSET
--progress=TYPE                 Select progress gauge type
--show-progress                 Display the progress bar in any verbosity mode
-N, --timestamping              Don't re-retrieve files unless newer than local
-S, --server-response           Print server response
--spider                        Don't download anything
-T, --timeout=SECONDS           Set all timeout values to SECONDS
--dns-timeout=SECS              Set the DNS lookup timeout to SECS
--connect-timeout=SECS          Set the connect timeout to SECS
--read-timeout=SECS             Set the read timeout to SECS
-w, --wait=SECONDS              Wait SECONDS between retrievals
--waitretry=SECONDS             Wait 1..SECONDS between retries of a retrieval
--random-wait                   Wait from 0.5*WAIT...1.5*WAIT secs between retrievals
--no-proxy                      Explicitly turn off proxy
-Q, --quota=NUMBER              Set retrieval quota to NUMBER
--bind-address=ADDRESS          Bind to ADDRESS (hostname or IP) on local host
--limit-rate=RATE               Limit download rate to RATE
--no-dns-cache                  Disable caching DNS lookups
--restrict-file-names=OS        Restrict chars in file names to ones OS allows
--ignore-case                   Ignore case when matching files/directories
-4, --inet4-only                Connect only to IPv4 addresses
-6, --inet6-only                Connect only to IPv6 addresses
--user=USER                     Set both ftp and http user to USER
--password=PASS                 Set both ftp and http password to PASS
--ask-password                  Prompt for passwords
--no-iri                        Turn off IRI support
--local-encoding=ENC            Use ENC as the local encoding for IRIs
--remote-encoding=ENC           Use ENC as the default remote encoding
--unlink                        Remove file before clobber
--xattr                         Turn on storage of metadata in extended file attributes
-nd, --no-directories           Don't create directories
-x, --force-directories         Force creation of directories
-nH, --no-host-directories      Don't create host directories
--protocol-directories          Use protocol name in directories
-P, --directory-prefix=PREFIX   Save files to PREFIX/
--cut-dirs=NUMBER               Ignore NUMBER remote directory components
--http-user=USER                Set http user to USER
--http-password=PASS            Set http password to PASS
--no-cache                      Disallow server-cached data
-E, --adjust-extension          Save HTML/CSS documents with proper extensions
--ignore-length                 Ignore 'Content-Length' header field
--header=STRING                 Insert STRING among the headers
--compression=TYPE              Choose compression, one of auto, gzip and none. (default: none)
--max-redirect=NUMBER           Maximum redirections allowed per page
--proxy-user=USER               Set USER as proxy username
--proxy-password=PASS           Set PASS as proxy password
--referer=URL                   Include 'Referer: URL' header in HTTP request
--save-headers                  Save the HTTP headers to file
-U, --user-agent=AGENT          Identify as AGENT instead of Wget/VERSION
--no-http-keep-alive            Disable HTTP keep-alive (persistent connections)
--no-cookies                    Don't use cookies
--load-cookies=FILE             Load cookies from FILE before session
--save-cookies=FILE             Save cookies to FILE after session
--keep-session-cookies          Load and save session (non-permanent) cookies
--post-data=STRING              Use the POST method; send STRING as the data
--post-file=FILE                Use the POST method; send contents of FILE
--method=HTTPMethod             Use method "HTTPMethod" in the request
--body-data=STRING              Send STRING as data. --method MUST be set
--body-file=FILE                Send contents of FILE. --method MUST be set
--content-on-error              Output the received content on server errors
--secure-protocol=PR            Choose secure protocol, one of auto, SSLv2, SSLv3, TLSv1 and PFS
--https-only                    Only follow secure HTTPS links
--no-check-certificate          Don't validate the server's certificate
--certificate=FILE              Client certificate file
--certificate-type=TYPE         Client certificate type, PEM or DER
--private-key=FILE              Private key file
--private-key-type=TYPE         Private key type, PEM or DER
--ca-certificate=FILE           File with the bundle of CAs
--ca-directory=DIR              Directory where hash list of CAs is stored
--crl-file=FILE                 File with bundle of CRLs
--ciphers=STR                   Set the priority string (GnuTLS) or cipher list string (OpenSSL) directly
-r, --recursive                 Specify recursive download
-l, --level=NUMBER              Maximum recursion depth (inf or 0 for infinite)
--delete-after                  Delete files locally after downloading them
-k, --convert-links             Make links in downloaded HTML or CSS point to local files
-K, --backup-converted          Before converting file X, back up as X.orig
-m, --mirror                    Shortcut for -N -r -l inf --no-remove-listing
-p, --page-requisites           Get all images, etc. needed to display HTML page
-A, --accept=LIST               Comma-separated list of accepted extensions
-R, --reject=LIST               Comma-separated list of rejected extensions
--accept-regex=REGEX            Regex matching accepted URLs
--reject-regex=REGEX            Regex matching rejected URLs
--regex-type=TYPE               Regex type (posix)
-D, --domains=LIST              Comma-separated list of accepted domains
--exclude-domains=LIST          Comma-separated list of rejected domains
--follow-ftp                    Follow FTP links from HTML documents
--follow-tags=LIST              Comma-separated list of followed HTML tags
--ignore-tags=LIST              Comma-separated list of ignored HTML tags
-H, --span-hosts                Go to foreign hosts when recursive
-L, --relative                  Follow relative links only
-I, --include-directories=LIST  List of allowed directories
-X, --exclude-directories=LIST  List of excluded directories
-np, --no-parent                Don't ascend to the parent directory
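
The options above combine freely and may appear before or after the URL arguments. A sketch of two common invocations (example.com and the file names are illustrative placeholders; the command lines are built into variables and printed rather than run, so nothing touches the network):

```shell
# Illustrative wget invocations composed from the options documented above.
# example.com and the file names are placeholders, not from this reference.

# Resume an interrupted download: -c continues a partial file, -t 5 allows
# five retries, -w 2 waits two seconds between retrievals, and
# --limit-rate caps the download speed.
resume="wget -c -t 5 -w 2 --limit-rate=200k https://example.com/big.iso"

# Mirror a site for offline reading: -m is the -N -r -l inf
# --no-remove-listing shortcut, -k converts links to point at the local
# copies, -p fetches page requisites (images, CSS), and -np stops the
# crawl from ascending past the start directory.
mirror="wget -m -k -p -np https://example.com/docs/"

printf '%s\n%s\n' "$resume" "$mirror"
```

The accept/reject, domain, and quota options slot into the same pattern when a recursive crawl needs narrowing.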