#244 closed enhancement (fixed)
mindi 2.0.0 broken for tape support when called from mondo
Reported by: | Bruno Cornec | Owned by: | Bruno Cornec |
---|---|---|---|
Priority: | highest | Milestone: | 2.2.6 |
Component: | mindi | Version: | 2.2.5 |
Severity: | blocker | Keywords: | tags$ |
Cc: | | | |
Description
Error reported in logs:
Max kernel size on 16384 KB image (est'd) = 13929 K
cp: cannot stat `/u01/mondo.scratch.27808/mondo.scratch.4815/images/*.img': No such file or directory
cp: cannot stat `/u01/mondo.scratch.27808/mondo.scratch.4815/images/*.gz': No such file or directory
OfferToMakeBootableISO: Cannot copy to /u01/mondo.tmp.IrU4Jk/iso/images
mindi_lib = /usr/lib/mindi
Created bootable ISO image at /u01/mondo.scratch.27808/mondo.scratch.4815/images/mindi.iso
FATAL ERROR. Cannot find all.tar.gz, to be written to tape
Change History (7)
comment:1 by , 17 years ago
Status: | new → assigned |
---|---|
comment:2 by , 17 years ago
Both text mode and newt mode exhibit the issue. It seems linked to the use of read_file_from_stream_to_stream (used only here), which calls read_file_from_stream_FULL, which in turn calls fwrite and receives SIGPIPE after roughly three slices have been processed. A sketch of this failure mode follows.
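For illustration only, here is a minimal C sketch (not the actual mondo source) of the mechanism described above: with the default signal disposition, an fwrite to a pipe whose reader has exited raises SIGPIPE and kills the writer mid-copy; ignoring SIGPIPE instead surfaces the condition as a short write with errno set to EPIPE, which the copy loop can report. The function name copy_stream_to_stream and the buffer size are assumptions, standing in for read_file_from_stream_to_stream/read_file_from_stream_FULL.

```c
#include <errno.h>
#include <signal.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical stand-in for mondo's stream-to-stream copy path. */
static int copy_stream_to_stream(FILE *in, FILE *out, long length)
{
    char buf[4096];

    /* With the default disposition, fwrite() to a pipe whose reader has
     * gone away raises SIGPIPE and terminates the process mid-copy --
     * consistent with the crash after roughly three slices. Ignoring
     * SIGPIPE makes fwrite() fail with errno == EPIPE instead. */
    signal(SIGPIPE, SIG_IGN);

    while (length > 0) {
        size_t chunk = (length > (long)sizeof(buf)) ? sizeof(buf)
                                                    : (size_t)length;
        if (fread(buf, 1, chunk, in) != chunk) {
            fprintf(stderr, "short read (eof=%d)\n", feof(in));
            return -1;
        }
        if (fwrite(buf, 1, chunk, out) != chunk) {
            /* A dead reader now shows up here as EPIPE rather than
             * killing the process with a signal. */
            fprintf(stderr, "short write: %s\n", strerror(errno));
            return -1;
        }
        length -= (long)chunk;
    }
    return 0;
}
```

The sketch only demonstrates the mechanism; in mondo itself any such fix would belong in the write path of read_file_from_stream_FULL.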
comment:3 by , 17 years ago
It appears that this problem occurs when using tapes with gzip as the compressor (the -G option of mondoarchive). A workaround in the meantime is to use bzip2 as the compressor instead. I have been able to successfully back up and restore a RHEL4 system with USB tapes and bzip2 (no -G option) without issue; an illustrative invocation follows.
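As a concrete example of the workaround, the invocation below omits -G so that bzip2, mondoarchive's default compressor, is used. The device path is a placeholder; -Ot and -d are mondoarchive's standard backup-to-tape options.

```
# Back up to tape with the default bzip2 compressor -- no -G, so gzip
# (the trigger for this bug) is never selected.
mondoarchive -Ot -d /dev/st0
```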
comment:6 by , 17 years ago
Resolution: | → fixed |
---|---|
Status: | assigned → closed |
comment:7 by , 15 years ago
Keywords: | tags$ added |
---|---|
Type: | defect → enhancement |
A first fix was committed in rev [1898]. Now, during restore, mondorestore crashes when restoring big files.