Poor RAR extract performance

I've noticed a significant degradation of file extraction performance when using Directory Opus, especially with RAR files, so I decided to run a test. With a 1.2 GB RAR archive, I first extracted it using the Opus internal extractor and then with WinRAR. It took Opus 26 seconds to extract it and WinRAR only 10 seconds. I'd love to just use WinRAR and call it a day, but I'd lose the convenience of Opus pre-populating the destination folder in the second pane. Is there any way I could get the best of both worlds? Maybe a secret way to direct Opus's internal extractor to use the WinRAR binary for RAR files instead of its own? I haven't tested ZIP or 7z performance yet since I mostly work with RAR files, but whatever binary gives the best performance for each type, I wish I could configure Opus to use it if present and only fall back to the "turtle"-paced internal extractor if all else fails.

Extraction already happens through unrar.dll.

Assuming WinRAR and unrar.dll use the same code, and you're extracting to a local drive which doesn't need UAC elevation to write to, speed differences are more likely to come from either caching (extracting the same archive twice will likely be faster the second time) or antivirus treating the two programs differently.

If unrar.dll and WinRAR.exe have significant speed differences in general then it's something to ask the Rar developers about.

Example archives that you find slow in one tool and fast in the other would also help here; otherwise we're just left guessing where the difference may be.

OK, I ran the test in reverse and got 11 seconds with WinRAR and 25 seconds with Opus. It doesn't matter which RAR archive; the bigger it is, the more noticeable the time difference. I'm typically extracting MSFS scenery updates, some of which contain a lot of small files in an otherwise large archive. I extracted the file to a subfolder of the folder it was in using both extractors, one after the other, deleting the subfolder after each of course. Maybe WinRAR uses multithreading? I am using the current v6.21 of WinRAR, registered, if that is what is making the difference. How can you get away with using their DLL? Or is it your own version? If it's your own version, that is likely why it is slower. But I don't know what the issue is.

If it's lots of small files (which definitely isn't "all archives", so it's an important detail), the difference may be in things we do per-file that WinRAR isn't doing. The options in Preferences / File Operations / Copy Attributes can turn much of that off.

Unrar.dll is a DLL the RAR developers provide which anyone can use.

Multithreading is unlikely to be a factor when extracting an archive (only when compressing one, for some newer algorithms), and is something Unrar.dll would typically handle by itself if it needed to.

I only have 'Clear read-only flag when copying from CDs', 'Preserve the attributes of copied files', and 'Preserve the timestamps of copied files' checked. The first option is not applicable because I'm not copying from a CD. I cleared the last two, even though WinRAR does preserve the timestamps of copied files. But even with those changed, and after restarting Opus, it only shaved 4 seconds off the time it takes Opus to extract the file, still much slower than using WinRAR directly. I'm not sure what file WinRAR uses when I initiate the extract from its right-click context menu... but for some reason I doubt it's the same unrar.dll you are using. I could be wrong though, but if it is different, I wish I could point Opus to it somehow.

You can copy another unrar.dll over the one under the Opus program files folder. They are generally compatible with each other (as long as they aren't older than the one we're using, at least).

I would test what other tools do first, though.

There's also the option to use 7z.dll instead of unrar.dll (Preferences / Zip & Other Archives / Archive and VFS Plugins, then configure the RAR item there).

I'd be surprised if that's faster but it's always possible.

Extracting the RAR file with 7-Zip directly gives me the same performance as WinRAR itself. So it's only Opus that is sluggish when it comes to extracting RAR files. Next I tried this: I compressed all the files in the folder into a 7z file, basically converting the archive from RAR to 7z. Extracting this 7z archive took only a few seconds, nearly instantaneous, in both 7-Zip and Opus, but was slower, back to the standard 10 seconds, with WinRAR. I wish we could choose the extractor Opus used; I think I'd point it to 7-Zip. Maybe you just have unrar.dll configured wrong in some way when it comes to unpacking RAR files.

You can. See my post above yours.

Changing the preferences to use 7z.dll instead of unrar.dll for RAR files gives me nearly the performance of using WinRAR directly: 12 seconds vs 10 seconds. So that is a good fix, but I'd like to see if I can get it to actually match the performance. Where do I put the new DLL versions? In this path?
C:\Program Files\GPSoftware\Directory Opus\VFSPlugins

And what about the 32-bit version? Do I have to replace that in the 32-bit folder? Which version is the Opus plugin actually using? I mean, it says unrar.dll in the preference settings, not unrar64.dll, and I'm using 64-bit Opus. I tried replacing both unrar.dll and unrar64.dll in the respective plugin folders, along with the 64-bit version of 7z.dll (I don't have the 32-bit version of 7-Zip installed to get a 32-bit DLL to replace it with). However, after restarting Opus I saw no performance difference: still 21 seconds with the Opus internal unpacker / unrar.dll, and 12 seconds instead of 10 when configured to use 7z.dll. Maybe I'm not putting the DLLs in the right folders?

Yeah, I've tried everything. It's not a matter of file version. Unrar.dll (actually unrar64.dll) is simply much slower at unpacking for me versus using WinRAR directly, even after I replace the DLL with the most current edition. Using 7z.dll gives me the performance of 7-Zip directly, but 7-Zip seems to be slightly slower in general when handling RAR files, maybe 20% slower versus WinRAR on the file I tested. I guess that's good enough until the issue can be resolved. Others should test this as well to confirm my findings. If so, then maybe the problem was introduced in one of the more recent versions of unrar.dll, as I found a thread on here where people were praising unrar.dll as being faster than 7z.dll back in 2016. I find it hard to believe they would intentionally program the DLL to be slower than their main product... at least not a performance degradation this significant. It's almost twice as slow to extract.

I suggest using WinRAR if a few seconds of performance is that important to you.

Using 7z.dll for just a small performance hit is acceptable for the convenience of using the Opus internal unpacker. But if I were the Opus programmer, I'd be curious to confirm whether I too was getting a big, noticeable performance hit with unrar.dll, and if so, get back to the developer to ask why. I mean, back in 2016 he seemed willing to help.

That’d be easier if you provided an example archive so we could verify what you’re seeing. 🙂

Please also link your account if you want to continue this.

Better Late than Never...

We have added native support for additional archive formats, including tar, 7-zip, rar, gz and many others using the libarchive open-source project. You now can get improved performance of archive functionality during compression on Windows.

The RAR file does not matter; it's more about the size. You need one that is big enough, with enough files, that the decompression lasts long enough to time it, especially if you are using an M.2 drive like I am. This requires an archive of 500 MB+, which I'm not going to attempt to upload to the forum. However, I found an openly available test RAR you can download that illustrates the problem. Download the 616 MB sample.rar from here:

https://figshare.com/articles/dataset/sample_rar/5487316

Use the Directory Opus Archives right-click context menu to extract it to a "sample" folder. Do this once with unrar.dll enabled and again with 7z.dll. For me it takes 9 seconds with unrar.dll and 5 seconds with 7z.dll. Now try it with WinRAR: for me, this took 5 seconds as well. So for this file, the performance using 7z.dll was the same as WinRAR, since it's a relatively smaller archive. But the question is, why isn't the performance of unrar.dll the same as WinRAR's, or at least relatively close? I'm seeing an 80% performance hit with unrar.dll, even with the latest unrar64.dll.
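For clarity, the 80% figure is just the relative slowdown computed from the two timings above (a quick sanity check on the arithmetic, not new data):

```javascript
// Timings from the test above: unrar.dll took 9 s, 7z.dll/WinRAR took 5 s.
var unrarSecs = 9;
var fastSecs = 5;

// Relative slowdown of unrar.dll versus the faster extractors.
var slowdownPct = ((unrarSecs - fastSecs) / fastSecs) * 100;
console.log(slowdownPct + '% slower'); // → "80% slower"
```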

As far as my account goes, I'm only on week one of evaluating Directory Opus. It's a great product and I have every intent of registering it, hence why I'm here trying to make it even better. I've found it so useful I can't imagine running Windows without it. But it's to my advantage to wait as long as possible to register, because v12 has been out for several years now and we don't know when v13 might be released. v13 could come out later this year and you might say everyone who registered v12 from July 1st onward can upgrade for free, so holding out for my maximum 60 days wouldn't hurt. 😉 Perhaps you have a beta v13 I could test, or if nothing is imminent, I could go ahead and register. Let me know.

I am seeing similar results, unrar.dll is substantially slower than winrar.exe:

7z       5980 ms
winrar   6093 ms
unrar   13833 ms

UnrarBench extracts /downloads\sample.rar and logs the duration:

function OnInit(initData) {
    initData.name = 'UnrarBench';
    initData.version = '2023-05-25';
    initData.url = 'https://resource.dopus.com/t/poor-rar-extract-performance/44480';
    initData.default_enable = true;
    initData.min_version = '12.0';
}

function OnAddCommands(addCmdData) {
    var cmd = addCmdData.AddCommand();
    cmd.name = 'UnrarBench';
    cmd.method = 'OnUnrarBench';
    cmd.hide = false;
    cmd.icon = 'script';
}

function OnUnrarBench(scriptCmdData) {
    var cmd = scriptCmdData.func.command;
    var fsu = DOpus.FSUtil();
    var wsh = new ActiveXObject('WScript.Shell');
    var exeWinRAR = fsu.Resolve('{apppath|winrar}winrar.exe');
    var item = fsu.GetItem(fsu.Resolve('/downloads\\sample.rar'));

    var scriptStartTime = DOpus.Create().Date();
    var targetFolder = item.path + '\\' + scriptStartTime.Format('D#yyyyMMdd-T#HHmmss');
    cmd.RunCommand('CreateFolder NAME="' + targetFolder + '"');

    cmd.deselect = false;

    cmd.RunCommand('Set UTILITY=otherlog');
    DOpus.ClearOutput();

    var cmdLine = 'Copy EXTRACT FILE="' + item + '" TO="' + targetFolder + '"';
    DOpus.Output(cmdLine);
    cmd.RunCommand(cmdLine);

    // Alternative: bypass Opus and run WinRAR.exe directly (note the quoting around the target path):
    // var cmdLine = '"' + exeWinRAR + '" x "' + item + '" "' + targetFolder + '"';
    // DOpus.Output(cmdLine);
    // wsh.Run(cmdLine, 1, true);

    var scriptEndTime = DOpus.Create().Date();

    // Elapsed time: the Date difference combined with the sub-second .ms parts, in milliseconds
    var duration = (scriptEndTime + scriptEndTime.ms - scriptStartTime - scriptStartTime.ms);
    DOpus.Output('Script running time: ' + duration + ' milliseconds');

    cmd.RunCommand('FileType NEW=.txt PATH="' + targetFolder + '" NEWNAME="norename:' + duration + ' milliseconds.txt"');
}

CommandUnrarBench.js.txt


Handy script!

Using the provided sample.rar, I can confirm similar timings.

It's interesting that the archive is using a very old version of the RAR format, and if you extract and recompress it with a more recent version of WinRAR, the result decompresses faster. (Using unrar.dll, ~12s for the original, ~9s for the recompressed version. Using WinRAR.exe or 7z.dll, ~6s for the original archive. So just recompressing it to a newer RAR format takes off half the time difference.) Another example of why it matters to use the same test archive when verifying results.

Switching back to the original archive, I did a test with RARLab's sample UnRar.exe code (which uses UnRar.dll). That is not slow, so the issue isn't entirely inherent to UnRar.dll. But I suspect the problem is when UnRar.dll is used in callback mode. (Opus needs to use callbacks to work with things like UAC and FTP destinations. The UnRar.exe sample doesn't need that, as it always writes its files directly, and fails if the destination needs UAC or isn't a real directory.)

Doing a little debugging, the callback we give to UnRar.dll is being called with new data (UCM_PROCESSDATA) in very small chunks as the files decompress. Small chunks are inevitable with small files, but they're also happening with the larger ones. I might be wrong, but I suspect the bottleneck is there, and the reason 7z.dll is faster is that it's buffering the data better, sending fewer, larger chunks instead of many more tiny chunks.

Even if I make our callback do absolutely nothing, and not actually write the data to disk, the speed barely changes at all. While I haven't investigated in great detail, that suggests the overheads are on the UnRar.dll side somewhere. (Which makes sense, as the actual code for using UnRar.dll and 7z.dll in Opus is virtually identical, with a small wrapper converting one interface to the other. Since 7z.dll is fast in the same situation, the issue is likely either in the wrapper or UnRar.dll, and I've tried making the wrapper do almost nothing without much change in timings.)

Since it doesn't seem that the bottleneck is on our side, and using the option for 7z.dll instead of unrar.dll solves things, I suspect there's nothing we can do on our side to speed things up when using UnRar.dll, unless I've overlooked something.


Since the problem appears to be within unrar.dll, maybe it was introduced in one of the later versions of the DLL. I find it hard to believe it always existed and I was just the first to notice. If I could find a source for past versions, I'd try them out. Of course, an older DLL might break Opus in some way, but if the most recent fast version could be found, and it wasn't substantially different from the current version, we could just use that for now. There are RAR archives where WinRAR is faster than 7-Zip.