FW: Failed to post to CodePlex.com project 7zbackup

Feb 19, 2010 at 10:57 PM

Hello Anlan,

I see your point now and understand why you chose that solution. There is a new version of 7-Zip (v9.10 beta) and I was wondering whether it still has the same ‘issue/feature’, since it allows for a lot more flexibility in selecting the files to zip. Maybe the issue is, or will be, solved in this or a future release? I have not tested for it yet, but I will let you know when I get around to it.

There is, however, another route you could take: bypass the 7-Zip selection altogether by building a file list with PowerShell expressions, which appear to be very powerful. That would mean a substantial change to the script, though. I have seen some scripts with powerful selection mechanisms. The question is which is worse: redoing the file selection or using junction. A rough sketch of what I mean is below.
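Something along these lines, perhaps (just a rough, untested sketch; the source folders, the filter and the 7z.exe path are only placeholders):

# Rough idea only: select files with PowerShell, write the names to a list
# file and hand that list to 7-Zip. Paths and filters are placeholders.
$sources  = 'C:\Dir1', 'C:\Dir2'
$listFile = Join-Path $env:TEMP '7zbackup-files.txt'

$sources |
    ForEach-Object { Get-ChildItem -Path $_ -Recurse -Force } |
    Where-Object { -not $_.PSIsContainer } |
    ForEach-Object { $_.FullName } |
    Set-Content -Path $listFile

# @listfile tells 7-Zip to read the file names from the list file.
& 'C:\Program Files\7-Zip\7z.exe' a 'D:\Backup\archive.7z' "@$listFile"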

Anyhow, thanks for your clarification and I will mail soon about my findings.

Kind regards,

Gert-Wim

From: Anlan [mailto:notifications@codeplex.com]
Sent: Friday, 19 February 2010 20:05
To: codeplex@scheppink.com
Subject: Re: RE: Failed to post to CodePlex.com project 7zbackup [7zbackup:82945]

From: Anlan

gwscheppink wrote:

It now occurs to me that I do not fully understand why you use the junction tool. On a local system you could use “subst”, though I don’t see why (probably to prevent the use of UNC paths). On remote drives you could use “net use” to temporarily map a share or a subfolder of that share. We use this in our environment for our current archiving. This should do the trick too, without ever using junction.exe.

You could test for this behavior in the script and warn users about this.

Hi Gert, hope everything is ok on your side.

I hope you'll appreciate the further improvements I made to the script following your suggestions (sorry the project is developing so slowly, but ... I work on it in my free time).

To answer your question about the use of junctions instead of NET USE or SUBST... it comes down to how 7-zip behaves when scanning directories in search of files to archive. Imagine the following scenario: you have two directories on your source drive, named \dir1 and \dir2. Each of them contains (just as an example) a file named sample.txt, so the tree command reports something like this:

C:.
├───Dir1
│       Sample.Txt
│
└───Dir2
        Sample.Txt

You want to archive all this content in a single compressed file, in a single pass ... well, if you pass 7-zip a list of directories like C:\Dir1\* and C:\Dir2\*, you will get back a Duplicate File Name error (as sample.txt is encountered twice).
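Just to make the failing case concrete, the kind of call I mean looks like this (the 7z.exe path is only an example):

# This kind of invocation is what triggers the Duplicate File Name error,
# because both masks end up contributing a file named Sample.Txt.
& 'C:\Program Files\7-Zip\7z.exe' a 'D:\Backup\archive.7z' 'C:\Dir1\*' 'C:\Dir2\*'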
The "problem" is well known to 7-zip's developer Igor Pavlov: he says it's a behavior by design which can be bypassed making 7-zip search "relative" to the files.
In other words you should "group" your multiple sources under a "master root" and make the scan relative. Like this:

C:.
└───MasterDir
    ├───Dir1
    │       Sample.Txt
    │
    └───Dir2
            Sample.Txt

This way, you enter MasterDir and make it the working directory, therefore passing 7-zip a list like this:

Dir1\Sample.txt
Dir2\Sample.txt

This works around the duplicate file name problem. The only option I had to temporarily "rewrite" the file system during backup operations ... was to drop in junction points. This way you can create a customized "view" of your file system and let 7-zip scan from a single root. SUBST can't do this, since it only maps one drive letter to a single path, and the same goes for NET USE. A rough sketch of the idea is below.
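Roughly like this, just to sketch the idea (the folder names, the junction.exe location and the 7z.exe path are examples, not the actual script code):

# Sketch only: group the sources under a temporary "master root" with
# junction points, then pass 7-zip a list of paths relative to that root.
$root = 'C:\MasterDir'
New-Item -ItemType Directory -Path $root -Force | Out-Null

# junction.exe is the Sysinternals tool; its location here is just an example.
& 'C:\Tools\junction.exe' (Join-Path $root 'Dir1') 'C:\Dir1'
& 'C:\Tools\junction.exe' (Join-Path $root 'Dir2') 'C:\Dir2'

# Make the master root the working directory and build a relative file list.
Set-Location $root
$listFile = Join-Path $env:TEMP 'relative-list.txt'
Get-ChildItem -Path $root -Recurse -Force |
    Where-Object { -not $_.PSIsContainer } |
    ForEach-Object { $_.FullName.Substring($root.Length + 1) } |
    Set-Content -Path $listFile

# 7-zip reads the relative names (Dir1\Sample.Txt, Dir2\Sample.Txt, ...) from the list.
& 'C:\Program Files\7-Zip\7z.exe' a 'D:\Backup\archive.7z' "@$listFile"

# Drop the junction points again when the archive is done (-d deletes a junction).
& 'C:\Tools\junction.exe' -d (Join-Path $root 'Dir1')
& 'C:\Tools\junction.exe' -d (Join-Path $root 'Dir2')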

Hope my explanation makes sense to you.


Coordinator
Feb 20, 2010 at 11:02 AM
gwscheppink wrote:

Hello Anlan,

There is, however, another route you could take: bypass the 7-Zip selection altogether by building a file list with PowerShell expressions, which appear to be very powerful. That would mean a substantial change to the script, though. I have seen some scripts with powerful selection mechanisms. The question is which is worse: redoing the file selection or using junction.

Hi Gert,

Unfortunately, the 7-zip 9 beta is affected by the same issue.

To answer your suggestion about creating a file list: well ... this is exactly what the script already does. 7-zip can only scan "relative" subdirs for files using file masks. The script addresses two goals: it makes all sources "relative" to a root (via junctions), and it builds a list of files using several directives (Archive attribute state, regular expressions against file names, regular expressions against directory names, rules to stop recursion into specific subdirs, file ages ... maybe more will come soon). If you look at the documentation page, you'll find that all selection directives are used to build a list of files (an index) which is then passed to 7-zip for archiving. So 7-zip does not have to cope with selection at all: it receives a precise list of files, and only the files in that list (selected by the script, not by 7-zip) are archived. Junctions and selection work in synergy to let 7-zip do the whole job in a single pass. A simplified illustration of that selection step is below.
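Something like this in spirit, heavily simplified (this is not the actual 7zbackup code, and the directive values below are made up):

# Simplified illustration of the selection directives; values are invented.
$root       = 'C:\MasterDir'                 # root holding the junctions
$maxAgeDays = 30                             # example "file age" directive
$nameMatch  = '\.(docx?|xlsx?|pdf)$'         # example regex against file names
$skipDirs   = 'temp|cache'                   # example regex against directory names

Get-ChildItem -Path $root -Recurse -Force |
    Where-Object { -not $_.PSIsContainer } |
    Where-Object { ($_.Attributes -band [IO.FileAttributes]::Archive) -ne 0 } |
    Where-Object { $_.Name -match $nameMatch } |
    Where-Object { $_.DirectoryName -notmatch $skipDirs } |
    Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-$maxAgeDays) } |
    ForEach-Object { $_.FullName.Substring($root.Length + 1) } |
    Set-Content -Path (Join-Path $env:TEMP 'backup-index.txt')

The resulting index is then handed to 7-zip, which only has to archive exactly those entries.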

Regards.