
wget duplicate files

I'm executing the command:

```sh
wget -b --no-clobber -nc -w 0 -Q 0 -r -E -e robots=off -U mozilla -t 0 --no-dns-cache -4 -R gif,jpeg,tif,jpg,pdf,bmp,png,css,js
```

Problem: if I re-run the command (necessary whenever wget stops abruptly), I get duplicates of some files, such as example.html and example.1.html. Oddly enough, a .2.html or .3.html is never created. Any idea how to prevent this?
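A quick way to see which files picked up the numeric suffix after a second run (just a sketch; it assumes the crawl output sits under the current directory):

```sh
# List the renamed copies wget created instead of clobbering the originals.
find . -name '*.1.html'
```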

Try with the `-c` option:


```
-c
--continue
    Continue getting a partially-downloaded file. This is useful when
    you want to finish up a download started by a previous instance of
    Wget, or by another program. For instance:

        wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z

    If there is a file named ls-lR.Z in the current directory, Wget
    will assume that it is the first portion of the remote file, and
    will ask the server to continue the retrieval from an offset equal
    to the length of the local file.
```
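As a quick illustration of `-c` on its own (a sketch; the URL is just a placeholder, not from your crawl):

```sh
# First run is interrupted partway through (network drop, Ctrl-C, ...):
wget https://example.com/big.iso
# Second run resumes from the length of the partial local file
# instead of saving a renamed duplicate:
wget -c https://example.com/big.iso
```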


So, something like this (I removed `--no-clobber`, since it's just the long form of `-nc`):


```sh
wget -b -c -nc -w 0 -Q 0 -r -E -e robots=off -U mozilla -t 0 --no-dns-cache -4 -R gif,jpeg,tif,jpg,pdf,bmp,png,css,js
```
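Since the crawl sometimes dies partway, you could also wrap that command in a retry loop rather than re-running it by hand. A sketch, with two assumptions: I dropped `-b`, because backgrounding each attempt would let the loop spawn overlapping wget processes, and `$URL` stands in for the site being mirrored, which your command elides:

```sh
#!/bin/sh
# Hypothetical target; the original command omits the URL.
URL="https://example.com/"

# Keep retrying until wget finishes the crawl cleanly (exit status 0).
until wget -c -nc -w 0 -Q 0 -r -E -e robots=off -U mozilla -t 0 \
           --no-dns-cache -4 -R gif,jpeg,tif,jpg,pdf,bmp,png,css,js \
           "$URL"
do
    echo "wget exited with status $?; retrying in 5 seconds..." >&2
    sleep 5
done
```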
