View Full Version : Worst compression for a download?

25th January 2007, 14:00
What is the worst compression setup you have encountered in a download?

For me it was just now.

1, The download contained a single zip file.
2, Normally that means the ISO is inside of it. But no! I opened it to discover a numbered set of 73 smaller zip files inside.
3, I extracted this series of zip files, only to end up with another numbered set of rar files. Ahhh!
4, So I extracted those and ended up with another single big rar file. :unsure: hmm...
5, Finally I extracted that and ended up with an ISO.

The only question is: why? So many compressed files inside compressed files. If just one contained an error anywhere along the line, the lot would be lost. I have encountered packages this bad before and never understood why they make them so complex.

So what is the worst case of a compressed package you've encountered?

25th January 2007, 14:02
Nothing that bad, although I have had a .zip containing hundreds of .rars containing an ISO many times.

Once something's been compressed, there's little gain to be had compressing it further - which is why a .bmp will be much smaller zipped while a .png won't be.
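A quick sketch of that point, using Python's standard zlib module (as a stand-in for zip's DEFLATE):

```python
import zlib

# Highly redundant data (like an uncompressed .bmp) shrinks dramatically.
redundant = b"A" * 100_000
once = zlib.compress(redundant)

# Compressing the already-compressed output again gains almost nothing,
# because the first pass has removed the redundancy (like zipping a .png).
twice = zlib.compress(once)

print(len(redundant), len(once), len(twice))
```

The second pass typically comes out about the same size as the first, sometimes slightly larger, since the compressed stream looks close to random.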

25th January 2007, 14:03
I haven't encountered that many problems, but I've seen some that fit the description you gave above. Some people apparently enjoy splitting archives into tons of different parts for no apparent reason.

I've encountered quite a few CRC errors over the years though, sometimes on big archives like that if I'm not mistaken.

25th January 2007, 14:48
That's pretty bad. I've never had anything like that happen at home, but I've had plenty of zip/rar file disasters at work. The worst was when someone tried to zip hundreds of thousands of .tif files into one archive. I tried extracting it on my fastest machines here, but no luck. I had to have them re-zip everything in stages.

25th January 2007, 14:50
I have discovered recently that WinRAR will sometimes report a corrupt archive when in fact it is not, and you can get around this when extracting. Open the rar in WinRAR, select the files you wish to extract from the archive, click "Extract To", choose where on the HD you wish to extract the files to, and then tick the box labelled "Keep broken files". This will extract the archive and you will still get the ISOs even though WinRAR thinks they are damaged.

I get that a lot with PS2 games, but they always work when burnt to disc, so it's not the ISOs but WinRAR that has a problem with the ISO files.

25th January 2007, 15:00
I've had zips inside a zip when d/loading movies, but that just sounds absurd. It certainly can't be beneficial in any way, so why do it?

25th January 2007, 15:04
I think the theory is that having a series of split files inside a single compressed file helps limit the damage from corruption when downloading. Most scene groups also split their releases into sets for easier distribution on newsgroups, which is where most of them originally get released to begin with.
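The idea behind split sets can be illustrated with a minimal Python sketch (this is just fixed-size chunking, not the actual RAR volume format): if one part arrives corrupted, only that part needs re-downloading, not the whole archive.

```python
# Stand-in for a large ISO: 10240 bytes of sample data.
data = bytes(range(256)) * 40

# Split into fixed-size parts, the way a release is cut into
# .rar/.r00/.r01... volumes.
PART_SIZE = 1024
parts = [data[i:i + PART_SIZE] for i in range(0, len(data), PART_SIZE)]

# A corrupted part is replaced by re-downloading just that one piece;
# joining the parts back together restores the original.
rejoined = b"".join(parts)
assert rejoined == data
print(len(parts), "parts of", PART_SIZE, "bytes")
```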

Demon Cleaner
25th January 2007, 15:10
I never had such problems with compressed files. On Usenet they always use the PAR system, so one file is spread across 10-80 different rar volumes named .rar, then .r00, .r01 and so on. When one volume is corrupted, you use a par file to replace it. Then came par2 files, which were a lot better, and now there's even autopar.

Stephen Coates
25th January 2007, 15:53
I was downloading a 4GB (may have been bigger than that) ISO image which was in about 50 or more small RAR files.

That was a pain as I had to go through several web pages to download each one.

Don't normally have that many problems with archives though.

25th January 2007, 16:01
I remember downloads like that. Before P2P networks were common, the only two ways to get most stuff were via a news server or via a pirate website hosting the files. The problem with such sites is that they try to make money from you by making you click through various screens between each download. I remember in the late 90s trying to get the latest copy of Photoshop for my course work, and that took forever: manually selecting each of the small rar or zip files, accepting lots of screens and adverts before the download would begin, then starting again for the next one. Plus you could only open two connections at once, so you could only be downloading two parts at a time. That was a slow old process.

Stephen Coates
25th January 2007, 17:23
This was on RapidShare.

I think it was quicker than usual and could have several files downloading at once because I was using someone else's premium account :) (they did give me permission to)

Demon Cleaner
25th January 2007, 20:33
"Having to manually select each of the small rar or zip files, accept lots of screens and adverts before the download would begin"

That's why there are now .nzb (newzbin) files. You d/l one (just a few kb), load it into your Usenet browser, and it queues only the files you need. Just 2-3 clicks and you can be d/ling 10 queues of 2GB each.

25th January 2007, 23:58
That sounds easier. I've not used Usenet/newsgroups for a long time to download anything. I used to when it was just the select and combine method. But not for a couple of years at least.

26th January 2007, 14:35
Speaking of worst compression for a download... I just remembered all the problems I had when I was swapping/mail trading. Many people used utilities like Quarterback and Diavolo when sending disks. For those of you who aren't familiar with them, Quarterback and Diavolo are back-up programs for the Amiga. Sometimes I received, say, eight disks with lots of content. A few times, I started restoring the data from the disks and there was an error on, say, disk five. This was really frustrating, because in the older versions you couldn't proceed with the restoration process. In other words, you lost important files. :( The best way to get stuff was on disks formatted with FFS. :)