 
Old 09-26-2012, 01:45 AM
Dave Stevens
 
how to use the wget command to copy a whole website to local disk

Quoting yujian <yujian4newsgroup@gmail.com>:


I want to copy a website to my local disk. I used the command wget -r
www.example.com, but I found that only the HTML was copied.


What else were you expecting to be copied? And have you read the man
page, or maybe an online tutorial?


Dave







--
If all the advertising in the world were to shut down tomorrow, would people
still go on buying more soap, eating more apples, giving their children more
vitamins, roughage, milk, olive oil, scooters and laxatives, learning more
languages by iPod, hearing more virtuosos by radio, re-decorating their
houses, refreshing themselves with more non-alcoholic thirst-quenchers,
cooking more new, appetizing dishes, affording themselves that little extra
touch which means so much? Or would the whole desperate whirligig slow
down, and the exhausted public relapse upon plain grub and elbow-grease?

--- Dorothy L Sayers, in Murder Must Advertise


 
Old 09-26-2012, 01:51 AM
yujian
 
how to use the wget command to copy a whole website to local disk

On 2012/9/26 9:45, Dave Stevens wrote:

Quoting yujian <yujian4newsgroup@gmail.com>:


I want to copy a website to my local disk. I used the command wget -r
www.example.com, but I found that only the HTML was copied.


What else were you expecting to be copied? And have you read the man
page, or maybe an online tutorial?


Dave






All the files on the website, such as PDF, DOC, and EXE files. I read
the man page, which is why I used wget -r to try to download them.
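
For grabbing specific file types like these, wget's accept list may
also help. A rough, untested sketch (the URL and extension list are
examples only):

# recurse without climbing above the start directory, and keep only
# files matching the listed extensions (-A is --accept)
wget -r -np -A pdf,doc,exe http://www.example.com/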

 
Old 09-26-2012, 02:06 AM
Dave Stevens
 
how to use the wget command to copy a whole website to local disk

Quoting yujian <yujian4newsgroup@gmail.com>:


On 2012/9/26 9:45, Dave Stevens wrote:

Quoting yujian <yujian4newsgroup@gmail.com>:


I want to copy a website to my local disk. I used the command wget -r
www.example.com, but I found that only the HTML was copied.


What else were you expecting to be copied? And have you read the
man page, or maybe an online tutorial?


Dave






All the files on the website, such as PDF, DOC, and EXE files. I
read the man page, which is why I used wget -r to try to download them.


Maybe it would help if you told us the exact command line you used.

D








 
Old 09-26-2012, 02:24 AM
"Mark C. Allman"
 
how to use the wget command to copy a whole website to local disk

On Tue, 2012-09-25 at 19:06 -0700, Dave Stevens wrote:
> Quoting yujian <yujian4newsgroup@gmail.com>:
>
> > On 2012/9/26 9:45, Dave Stevens wrote:
> >> Quoting yujian <yujian4newsgroup@gmail.com>:
> >>
> >>> I want to copy a website to my local disk. I used the command wget -r
> >>> www.example.com, but I found that only the HTML was copied.
> >>
> >> What else were you expecting to be copied? And have you read the
> >> man page, or maybe an online tutorial?
> >>
> >> Dave
> >>
> >
> > All the files on the website, such as PDF, DOC, and EXE files. I
> > read the man page, which is why I used wget -r to try to download them.
>
> Maybe it would help if you told us the exact command line you used.
>
> D

Looks to me like the command the OP used was:
wget -r <url>

I understand the original question, but I've never tried to download a
complete site. The "-m" (mirror) and "-p" (page requisites) switches
look interesting; see the sketch below. I'd first take a few minutes to
work through the man page; there's a lot of good documentation there.
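
For instance, a minimal, untested sketch combining those switches (the
URL is a placeholder):

# -m (--mirror) enables recursion with timestamping, -p
# (--page-requisites) grabs the images/CSS each page needs, and -k
# (--convert-links) rewrites links for local browsing
wget -m -p -k http://www.example.com/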

--
Mark C. Allman, PMP, CSM
Founder, See How You Ski
Allman Professional Consulting, Inc., www.allmanpc.com
617-947-4263, Twitter: @allmanpc


 
Old 09-26-2012, 02:33 AM
Waldemar Villamayor-Venialbo
 
how to use the wget command to copy a whole website to local disk

Hi,

Try this:

wget --recursive --page-requisites --convert-links --no-parent \
     --domains ocw.mit.edu --no-check-certificate --continue \
     "http://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-spring-2010/"

It has always worked for me; just replace the domain (after --domains)
and the URL with your own.

Best regards

Waldemar
 
Old 09-26-2012, 04:06 AM
Frank Cox
 
how to use the wget command to copy a whole website to local disk

On Wed, 26 Sep 2012 09:18:31 +0800
yujian wrote:

> I want to copy a website to my local disk. I used the command wget -r
> www.example.com, but I found that only the HTML was copied.

httrack
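
A typical httrack invocation, assuming the defaults suit (the URL and
output directory are placeholders):

# copy the site into ./mysite, restricted to links under example.com
httrack "http://www.example.com/" -O "./mysite" "+*.example.com/*" -v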

--
MELVILLE THEATRE ~ Real D 3D Digital Cinema ~ www.melvilletheatre.com
www.creekfm.com - FIFTY THOUSAND WATTS of POW WOW POWER!
 
Old 09-26-2012, 12:07 PM
Steven Stern
 
how to use the wget command to copy a whole website to local disk

On 09/25/2012 11:06 PM, Frank Cox wrote:
> On Wed, 26 Sep 2012 09:18:31 +0800
> yujian wrote:
>
>> I want to copy a website to my local disk. I used the command wget -r
>> www.example.com, but I found that only the HTML was copied.
>
> httrack
>
The appropriate command is

wget --mirror url

but I find it doesn't work if the server compresses pages; I get
only the home page.
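
One possible workaround, assuming the server honors the request, is to
ask it not to compress its responses. An untested sketch:

# ask for uncompressed pages so wget can parse the HTML for links;
# --convert-links and --page-requisites make the copy browsable offline
wget --mirror --convert-links --page-requisites \
     --header="Accept-Encoding: identity" http://www.example.com/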

--
-- Steve
 
Old 09-26-2012, 01:06 PM
yujian
 
how to use the wget command to copy a whole website to local disk

On Wed, Sep 26, 2012 at 8:07 PM, Steven Stern <subscribed-lists@sterndata.com> wrote:

> On 09/25/2012 11:06 PM, Frank Cox wrote:
> > On Wed, 26 Sep 2012 09:18:31 +0800
> > yujian wrote:
> >
> >> I want to copy a website to my local disk. I used the command wget -r
> >> www.example.com, but I found that only the HTML was copied.
> >
> > httrack
> >
> The appropriate command is
>
> wget --mirror url
>
> but I find it doesn't work if the server compresses pages; I get
> only the home page.

A good answer. Thank you very much.



 
