TechSpot

Untarring takes forever!

By me(who else?)
Nov 9, 2004
  1. I'm not sure if this is normal, but untarring a small (sub-100 MB) file takes a tremendous amount of time. This is on a 2.4 GHz Coppermine. It seems pretty insane... I don't know what's wrong.

    Is there any way to speed this up?
     
  2. Didou

    Didou Bowtie extraordinair! Posts: 4,274

    I doubt you're using a Coppermine CPU. They didn't go beyond 1 GHz. As for the extraction problem, what amount of time are we talking about here?

    I can extract the Linux kernel sources from a tar file in about 5 to 6 minutes. Is that the kind of time you get?
     
  3. me(who else?)

    me(who else?) TS Rookie Topic Starter Posts: 387

    It's a Celeron (Coppermine-based). I mean it isn't finished after an hour for a 25 MB file. That seems unduly long (this is with plain tar -x, not a graphical tool like File Roller).
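A quick way to put a number on "takes forever" is to time the extraction. A minimal sketch, which builds a tiny archive first so it runs anywhere; every path and file name here is made up for illustration:

```shell
# Make a throwaway archive to extract (illustrative paths only):
mkdir -p /tmp/timetest/out
echo sample > /tmp/timetest/a.txt
tar cf /tmp/timetest/archive.tar -C /tmp/timetest a.txt

# 'time' reports real/user/sys so you can see whether the wait
# is CPU work or something else (I/O, a hung process, etc.):
time tar xf /tmp/timetest/archive.tar -C /tmp/timetest/out
```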
     
  4. MYOB

    MYOB TS Rookie Posts: 492

    If it's a Celeron over 1.6 GHz or so, it's NetBurst-based; Willamette, Prescott, Northwood, etc. are all NetBurst cores.

    It's NOT Coppermine, or Tualatin, which is what replaced Coppermine. Coppermine was a P6-based core that was used for PIIIs.

    Fastest Coppermine Celeron was 1.1 GHz.

    Yours is Northwood-128 or Prescott-256 based.
     
  5. RealBlackStuff

    RealBlackStuff TS Rookie Posts: 6,503

    According to your user spec, you are doing this on a Toshiba laptop with a mobile Celeron 2.2.
    That laptop has only a 40 GB hard disk. Could it be that it is nearly full?

    Or are you talking about your Linux server with 233MHz processor and 64MB RAM?
     
  6. Mictlantecuhtli

    Mictlantecuhtli TS Evangelist Posts: 4,345   +11

    Maybe DMA isn't enabled on that hard disk, or maybe the package is corrupted. I guess you're using the ext3 filesystem? You can see what's going on during extraction by adding the 'v' parameter, for example tar zxvf package.tar.gz.
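A sketch of both suggestions above: verbose extraction so each file name is printed as it unpacks, and the usual DMA check with hdparm. All paths are illustrative, and the hdparm line is commented out because it needs root and assumes an IDE device named /dev/hda:

```shell
# Build a small gzipped tarball, then extract with 'v' (verbose):
mkdir -p /tmp/tartest/src /tmp/tartest/out
echo hello > /tmp/tartest/src/file.txt
tar czf /tmp/tartest/package.tar.gz -C /tmp/tartest src
tar zxvf /tmp/tartest/package.tar.gz -C /tmp/tartest/out

# Check whether DMA is on for an IDE disk (assumed device name,
# requires root, so left commented out):
# hdparm -d /dev/hda
```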
     
  7. me(who else?)

    me(who else?) TS Rookie Topic Starter Posts: 387

    The HD is nowhere near full (this is the laptop). I don't use up 40 gigs as fast as you would think (I'm at 11 GB). I'll try the v parameter.
     
  8. MYOB

    MYOB TS Rookie Posts: 492

    A Celeron-M 2.2 could be P6-based, but it'd be a Dothan, not a Coppermine...
     
  9. me(who else?)

    me(who else?) TS Rookie Topic Starter Posts: 387

    For whatever reason, when I watched the output, the extraction actually finished; it just never returned to the prompt. I never figured out why, but tar was completing without giving the prompt back. :confused:
     
  10. Nodsu

    Nodsu TS Rookie Posts: 5,837   +6

    Perhaps you suspended the tar process with Ctrl+Z and then brought it back with "fg"?

    And FYI, tar uses virtually no CPU, since tar archives are not compressed; they are just concatenated files. Uncompressing a tar.gz file is another matter.
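The point above can be seen directly: a plain .tar is just the input files concatenated with headers, so creating or extracting one is I/O-bound, while gzip is the step that actually burns CPU. A sketch with made-up paths and a 10 MB zero-filled test file:

```shell
# Set up a 10 MB dummy file (all paths illustrative):
mkdir -p /tmp/demo/data /tmp/demo/out
dd if=/dev/zero of=/tmp/demo/data/blob bs=1M count=10 2>/dev/null

tar cf /tmp/demo/plain.tar -C /tmp/demo data          # cheap: no compression
gzip -c /tmp/demo/plain.tar > /tmp/demo/plain.tar.gz  # this is the CPU-heavy step
tar xf /tmp/demo/plain.tar -C /tmp/demo/out           # cheap again: just copies data out
```

Wrapping each line in `time` makes the difference obvious: the gzip step dominates, the two tar steps barely register.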
     
  11. me(who else?)

    me(who else?) TS Rookie Topic Starter Posts: 387

    No, I can actually gzip a 3 GB file in less than an hour. It's untarring it that won't finish! I tried this trick with a larger file, but it still got hung up.
     