Force it to write files in Windows format rather than Unix? [WGET] - Software - Windows & Software
Marsh Posted on 19-05-2003 at 19:13:54
I don't have a direct answer to your question, but here is a workaround: write a script that runs wget and then uses unix2dos to convert the line endings in the resulting file.
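For instance, a minimal sketch of that wrapper script in Python (the URL and file name are placeholders, and both wget and unix2dos are assumed to be on the PATH):

    import subprocess

    # Placeholders -- substitute your own URL and output file name.
    url = "http://example.com/page.html"
    out = "page.html"

    # Fetch the page with wget (-O writes the document to a fixed file
    # name, see the option list further down in the thread), then let
    # unix2dos rewrite the bare LF line endings as CR+LF in place.
    subprocess.run(["wget", "-O", out, url], check=True)
    subprocess.run(["unix2dos", out], check=True)

The same two commands would work just as well as a plain batch or shell script; Python is only used here so the example stays self-contained.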
Marsh Posted on 19-05-2003 at 19:50:19
Pink Floyd wrote:
What exactly do you mean by a "fake" carriage return???
I bet that if you mail me your file, it'll work just fine on my machine...
Marsh Posted on 24-05-2003 at 16:35:47
trictrac wrote:
Well, I didn't overthink it: I wrote a small VB program that does the conversion for me...
EDIT: ah crap, copy-pasting cleans things up a little...
...so I can't actually show the problem here...
Anyway, long story short, I got it sorted. Thanks anyway!
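For anyone finding this thread later: the VB program itself can't be shown, but the conversion it performs presumably boils down to something like the following Python sketch (the file names are made up):

    # Rough equivalent of a small LF -> CRLF converter. Paths are placeholders.
    src = "page.html"       # file as wget saved it (Unix line endings)
    dst = "page_dos.html"   # converted copy (Windows line endings)

    with open(src, "rb") as f:
        data = f.read()

    # Normalize any existing CR+LF pairs to LF first so nothing gets
    # doubled, then expand every LF into CR+LF.
    data = data.replace(b"\r\n", b"\n").replace(b"\n", b"\r\n")

    with open(dst, "wb") as f:
        f.write(data)

Reading and writing in binary mode matters here: opening the file in text mode would let the runtime translate line endings behind your back.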
Marsh Posted on 19-05-2003 at 18:17:29
Hi everyone. I'm using WGET to retrieve an HTML page.
But it saves the file with Unix line endings (the whole page ends up on a single line, with each line break showing up as a lone control character).
Is there a way to force it to write a REAL carriage return (CR+LF)?
Thx...
EDIT: (I see nothing in the help output below that can help me; see also the note after the option list)
Startup:
-V, --version display the version of Wget and exit.
-h, --help print this help.
-b, --background go to background after startup.
-e, --execute=COMMAND execute a `.wgetrc'-style command.
Logging and input file:
-o, --output-file=FILE log messages to FILE.
-a, --append-output=FILE append messages to FILE.
-d, --debug print debug output.
-q, --quiet quiet (no output).
-v, --verbose be verbose (this is the default).
-nv, --non-verbose turn off verboseness, without being quiet.
-i, --input-file=FILE download URLs found in FILE.
-F, --force-html treat input file as HTML.
-B, --base=URL prepends URL to relative links in -F -i file.
--sslcertfile=FILE optional client certificate.
--sslcertkey=KEYFILE optional keyfile for this certificate.
--egd-file=FILE file name of the EGD socket.
Download:
--bind-address=ADDRESS bind to ADDRESS (hostname or IP) on local host.
-t, --tries=NUMBER set number of retries to NUMBER (0 unlimits).
-O --output-document=FILE write documents to FILE.
-nc, --no-clobber don't clobber existing files or use .# suffixes.
-c, --continue resume getting a partially-downloaded file.
--progress=TYPE select progress gauge type.
-N, --timestamping don't re-retrieve files unless newer than local.
-S, --server-response print server response.
--spider don't download anything.
-T, --timeout=SECONDS set the read timeout to SECONDS.
-w, --wait=SECONDS wait SECONDS between retrievals.
--waitretry=SECONDS wait 1...SECONDS between retries of a retrieval.
--random-wait wait from 0...2*WAIT secs between retrievals.
-Y, --proxy=on/off turn proxy on or off.
-Q, --quota=NUMBER set retrieval quota to NUMBER.
--limit-rate=RATE limit download rate to RATE.
Directories:
-nd --no-directories don't create directories.
-x, --force-directories force creation of directories.
-nH, --no-host-directories don't create host directories.
-P, --directory-prefix=PREFIX save files to PREFIX/...
--cut-dirs=NUMBER ignore NUMBER remote directory components.
HTTP options:
--http-user=USER set http user to USER.
--http-passwd=PASS set http password to PASS.
-C, --cache=on/off (dis)allow server-cached data (normally allowed).
-E, --html-extension save all text/html documents with .html extension.
--ignore-length ignore `Content-Length' header field.
--header=STRING insert STRING among the headers.
--proxy-user=USER set USER as proxy username.
--proxy-passwd=PASS set PASS as proxy password.
--referer=URL include `Referer: URL' header in HTTP request.
-s, --save-headers save the HTTP headers to file.
-U, --user-agent=AGENT identify as AGENT instead of Wget/VERSION.
--no-http-keep-alive disable HTTP keep-alive (persistent connections).
--cookies=off don't use cookies.
--load-cookies=FILE load cookies from FILE before session.
--save-cookies=FILE save cookies to FILE after session.
FTP options:
-nr, --dont-remove-listing don't remove `.listing' files.
-g, --glob=on/off turn file name globbing on or off.
--passive-ftp use the "passive" transfer mode.
--retr-symlinks when recursing, get linked-to files (not dirs).
Recursive retrieval:
-r, --recursive recursive web-suck -- use with care!
-l, --level=NUMBER maximum recursion depth (inf or 0 for infinite).
--delete-after delete files locally after downloading them.
-k, --convert-links convert non-relative links to relative.
-K, --backup-converted before converting file X, back up as X.orig.
-m, --mirror shortcut option equivalent to -r -N -l inf -nr.
-p, --page-requisites get all images, etc. needed to display HTML page.
Recursive accept/reject:
-A, --accept=LIST comma-separated list of accepted extensions.
-R, --reject=LIST comma-separated list of rejected extensions.
-D, --domains=LIST comma-separated list of accepted domains.
--exclude-domains=LIST comma-separated list of rejected domains.
--follow-ftp follow FTP links from HTML documents.
--follow-tags=LIST comma-separated list of followed HTML tags.
-G, --ignore-tags=LIST comma-separated list of ignored HTML tags.
-H, --span-hosts go to foreign hosts when recursive.
-L, --relative follow relative links only.
-I, --include-directories=LIST list of allowed directories.
-X, --exclude-directories=LIST list of excluded directories.
-np, --no-parent don't ascend to the parent directory.
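Indeed, none of the options above touch line endings: wget writes the response body byte for byte, so the file has LF endings because that is what the server sent. A quick way to check what a downloaded file actually contains, as a small Python sketch (the file name is a placeholder):

    # Count which line endings a downloaded file actually uses.
    data = open("page.html", "rb").read()
    crlf = data.count(b"\r\n")
    lf_only = data.count(b"\n") - crlf   # LFs not preceded by CR
    print("CR+LF:", crlf, "bare LF:", lf_only)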
Message edited by pink floyd on 19-05-2003 at 18:20:23
---------------
Yes to clear and precise topic titles...