Linux Archive

Linux Archive (http://www.linux-archive.org/)
-   Debian User (http://www.linux-archive.org/debian-user/)
-   -   problems with tar for backup (maximum tar file size?) (http://www.linux-archive.org/debian-user/97856-problems-tar-backup-maximum-tar-file-size.html)

"Jimmy Wu" 05-30-2008 02:46 AM

problems with tar for backup (maximum tar file size?)
 
I haven't been backing up any of my stuff, and yesterday I decided to
start doing that.
I want to use tar with bz2, and I wrote a little script to
hopefully automate the process (attached).
The script works, but tar doesn't. The logs show no errors until
somewhere near the end, when it says
tar: Error exit delayed from previous errors
but nothing else.

I've been searching online, and the only thing I can think of that's
wrong is that the directory is too big. From what I read, because of the
way tar works, a tar archive can't be bigger than 8 GB. My home
directory is about that size, maybe a little more. The largest single
file I have is a 2+ GB DVD iso.

So I was wondering: (1) Is it true that tar files can't be bigger than
8 GB, and (2) if so, what should I use to back up directories bigger
than 8 GB? I wanted to stick with tar because I can open tar archives
on other platforms. If directory size isn't the problem, then what
could be going on?

Thanks!

--
Jimmy Wu
Registered Linux User #454138
() ascii ribbon campaign - against html e-mail
/ www.asciiribbon.org - against proprietary attachments

Wackojacko 05-30-2008 09:03 AM

problems with tar for backup (maximum tar file size?)
 
Jimmy Wu wrote:

> [...]
>
> So I was wondering: (1) Is it true that tar files can't be bigger than
> 8GB, and (2) If so, what should I use to backup directories bigger
> than 8GB? I wanted to stick with tar because I can open those on
> other platforms. If directory size isn't the problem, then what could
> be going on?


I use backup2l and it makes backups using tar and bz2. My initial
backup is about 16G.


#ll /mnt/backup/backup/backup.1.tar.bz2
#-rw-r--r-- 1 root root 17991855331 2008-05-05 15:22
/mnt/backup/backup/backup.1.tar.bz2


Maybe it's a filesystem restriction, in the same way vfat is limited to 4G?
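A quick way to check that (just a sketch; /mnt/backup is only an
example mount point) is to look at the destination's filesystem type
and at the shell's own file-size limit:

```shell
# Show the filesystem type of the backup destination -- vfat, for
# instance, cannot hold a single file larger than 4G.
df -T /mnt/backup

# Show the maximum file size the shell lets a process create;
# "unlimited" is what you want to see here.
ulimit -f
```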

HTH

Wackojacko


--
To UNSUBSCRIBE, email to debian-user-REQUEST@lists.debian.org
with a subject of "unsubscribe". Trouble? Contact listmaster@lists.debian.org

"Douglas A. Tutty" 05-31-2008 02:40 AM

problems with tar for backup (maximum tar file size?)
 
On Thu, May 29, 2008 at 10:46:10PM -0400, Jimmy Wu wrote:
> I haven't been backing up any of my stuff, and yesterday I decided to
> start doing that
> I want to use tar with bz2, and I wrote this little script to
> hopefully automate this process (attached)
> The script works, but tar doesn't. The logs show no errors until
> somewhere near the end, when it says
> tar: Error exit delayed from previous errors
> but no other errors.
>
> I've been searching online, and the only thing I can think of that's
> wrong is the directory is too big. From what I read, the way tar
> works, the tar archive can't be bigger than 8GB. My home directory is
> about that much, maybe a little more. The largest file I have is a 2+
> GB dvd iso.
>
> So I was wondering: (1) Is it true that tar files can't be bigger than
> 8GB, and (2) If so, what should I use to backup directories bigger
> than 8GB? I wanted to stick with tar because I can open those on
> other platforms. If directory size isn't the problem, then what could
> be going on?

Read the tar info docs (tar-doc is in contrib or non-free, I forget
which, and is in info format, so you need either info or pinfo to read
it). The maximum archive size depends on the archive format you are
creating; however, neither the GNU nor the POSIX format has this
limitation, so it shouldn't be a problem.
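For instance, you can force a large-file-capable format explicitly when
creating the archive (the paths here are only illustrative):

```shell
# GNU tar's default "gnu" format and the "posix" (pax) format both
# allow members and archives larger than 8GB; the old v7 and ustar
# formats do not. --format=posix makes the choice explicit.
tar --format=posix -cjf /mnt/backup/home.tar.bz2 /home/jimmy
```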

I get that error if I tell tar to back up a dangling symlink (a symlink
that points to a non-existent file). Check the directories you are
backing up for such links. Even with the error, though, I get a
functional tarball at the end.
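You can list such links ahead of time with find (the path is only an
example):

```shell
# For symlinks, -xtype tests the type of the link's *target*, so
# -xtype l matches exactly the links whose target cannot be resolved.
find /home/jimmy -xtype l
```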

What happens if you add the -v verbose option? Do you see more detail?

Doug.



Mike McCarty 08-26-2008 04:20 PM

problems with tar for backup (maximum tar file size?)
 
Jimmy Wu wrote:

[...]


So I was wondering: (1) Is it true that tar files can't be bigger than
8GB, and (2) If so, what should I use to backup directories bigger
than 8GB? I wanted to stick with tar because I can open those on
other platforms. If directory size isn't the problem, then what could
be going on?


I use tar to do backups, and have generated tarballs which span 26
DVDs (roughly 118 Gig). The problem is likely not tar, but the file
system. Tar does not have a central directory; it has a distributed
one: each file in the archive is preceded by a little header which
describes it. Another thing to watch for, if you are putting files onto
DVDs as opposed to writing them raw, is that the largest single file
the ISO file system for DVDs (not UDF) can support is just under
2 Gig. So you can't write a single huge 4.x Gig file to a DVD using
that file system. Here's a little piece of the script I use for
creating chunks which fit, four to a DVD.
[-----------------------------------------------]
#!/bin/bash

# Create backups of /etc, /home, /usr/local, and...
PATH=/bin:/usr/bin

DEST_DIR=/var/backups

#omitteddirs="/srv /var/backups /var/games /mnt/usb/home/jmccarty/images
# /home/jmccarty/linux_versions"
# removed from list...

backupdirs="/boot /etc
[...]
/var/mail"

echo "System backup beginning" | wall

cd $DEST_DIR
mv -f backup.lst backupsave.lst

# 4482m works, but leaves 2% for rim damage.
# This splits the archive into pieces which can be written, four at a
# time, to a DVD.
tar cf - $backupdirs | split -d -b 1100m - backup.tar.
#   c            create an archive
#   f -          write it to standard output
#   $backupdirs  the directories to back up
#   split -d     use numeric suffixes on the pieces
#   -b 1100m     piece size, in 1024*1024-byte units
#   -            read from standard input
#   backup.tar.  prefix to use for the piece names

echo "System backups complete, status: $?" | wall

echo "Now verifying system backups" | wall
cat backup.tar.* | tar tv 1> backup.lst &&
echo "Verified, please burn backup /mnt/var/backups/backup.tar to DVD" | wall ||
echo "BACKUP FAILED" | wall
[--------------------------------------------------------]
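Restoring works the other way around: concatenate the pieces back into
a single stream and feed it to tar (the restore directory is only an
example):

```shell
# The numeric suffixes produced by split -d sort correctly, so a
# simple glob reassembles the archive in order.
mkdir -p /tmp/restore
cat backup.tar.* | tar xv -C /tmp/restore
```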

HTH. If the formatting gets lost, I can also send an attachment.

Mike
--
p="p=%c%s%c;main(){printf(p,34,p,34);}";main(){printf(p,34,p,34);}
Oppose globalization and One World Governments like the UN.
This message made from 100% recycled bits.
You have found the bank of Larn.
I speak only for myself, and I am unanimous in that!



"Jimmy Wu" 08-26-2008 06:03 PM

problems with tar for backup (maximum tar file size?)
 
On Tue, Aug 26, 2008 at 12:20, Mike McCarty <Mike.McCarty@sbcglobal.net> wrote:
> [...]
>
> I use tar to do backups, and have generated tarballs which span 26
> DVDs (roughly 118 Gig). The problem is likely not tar, but the file
> system. Tar does not have a central directory, it has a distributed
> directory. Each file in the archive has a little header which describes
> it. Another thing to watch for, if you are putting files onto DVDs,
> as opposed to writing them raw, is that the largest single file which
> the ISO file system (not UDF) for DVDs can support is just under
> 2 Gig. So you can't write a single huge 4.x Gig file to a DVD using
> that file system. Here's a little piece of the script I use for
> creating chunks which fit, four to a DVD.

Thanks for that - I've been using rsync and an external HD to do
incremental backups, but I'll save your message for future reference.
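For anyone curious, a hard-link-based incremental rsync run looks
roughly like this (the paths and the "latest" symlink convention are
only one way to set it up):

```shell
# Snapshot-style incremental backup: with --link-dest, files that are
# unchanged since the previous snapshot are hard-linked rather than
# copied, so each run stores only the data that actually changed.
SRC=/home/jimmy
DEST=/mnt/external/backups
STAMP=$(date +%Y-%m-%d)

mkdir -p "$DEST/$STAMP"
rsync -a --delete --link-dest="$DEST/latest" "$SRC/" "$DEST/$STAMP/"
ln -sfn "$DEST/$STAMP" "$DEST/latest"
```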

Jimmy



