Write in a file without changing attributes

Hi,

Sorry, I'm French, so I speak and write English like a French cow ^^

My title probably isn't very clear, so let me explain my problem :slight_smile:

My OS is Windows and I have a NAS with some "shared folders".
Note: The term "shared folder" on a NAS is quite different from the Windows term. On a NAS this is like a volume created by the administrator.

Anyway, when I move a big file within the same shared folder, the operation is very quick. But when I move it from one shared folder to another, it is very, very slow. I'm not sure, but I think Windows copies my file over the network, as if it were going from one network place to another.

Obviously I can do it much faster through my NAS's web interface, but every time I want to move a file I waste time launching my web browser, logging in to my NAS, launching the file browser and moving the file...
I couldn't stand it anymore, so I wrote a script on my NAS which monitors a file; when a source and a destination are written to this file, the script moves the file directly on the NAS.
The script works great.

My problem is here: I made a custom button in DO with this command:

chcp 850 > nul && echo {filepath$} {destpath} >> \\NAS\SCRIPT_PATH\MONITORED_FILE

It works perfectly too, except for one thing...

When I open the file I monitor with Notepad++, I see these "attributes":

But when DO writes into this file, the "attributes" change:

So my script doesn't understand the special characters and I'm unable to use my button :frowning:

So my question is: how can I write to a file (on the network) without changing its "attributes"?

Thank you for your time.

DOS commands like echo don't support Unicode by default, but you can make them do so using the cmd.exe /U argument.

Try this, in a button as a "Standard Function" (not "MS-DOS Batch Function"):

cmd.exe /U /C echo {filepath$} {destpath} >> \\NAS\SCRIPT_PATH\MONITORED_FILE

Note that the output file will be UTF-16LE without a BOM, so your script on the NAS will need to be able to handle text in that format. That is a limitation of the Windows command prompt and echo command, rather than Opus. It will do that even if you use chcp 65001 to set the codepage to UTF-8. (I guess we are lucky that Microsoft bothered to make cmd.exe and echo support Unicode at all. :slight_smile:)

Thank you for your help.

Still the same with your command :confused:

I have had many problems with special characters for a while now, and I'm not a developer, so the only solution I see is to replace the special characters with a custom character string before writing {filepath$} & {destpath} (é -> |e1|, è -> |e2|, etc.) and do the reverse process in my script.

Not very clean, but like I said, I've been having problems with these special characters for a while and I've never found a solution.
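On the Linux side, that reverse mapping could be sketched as a chain of sed substitutions. This is only an illustration, with a hypothetical code table (|e1| -> é, |e2| -> è) and a sample path, not the real script:

```shell
#!/bin/sh
# Hypothetical reverse mapping: turn the ASCII-safe codes written by
# the Windows batch back into accented characters.
demap() {
    sed -e 's/|e1|/é/g' -e 's/|e2|/è/g'
}

# Sample input as the Windows side would write it, then decoded.
printf '%s\n' '/share/Vid|e1|os/film.mkv' | demap
```

Because the mapped text is pure ASCII, it survives any codepage mangling on the way; the accents are only reintroduced once the text is safely on the NAS.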

My command works here.

Did you set the button type to Standard Function?

Yep,

When I say it doesn't work, I mean my script still can't understand the special characters.

I'm guessing the script is running on a flavour of Linux / Unix?

The script will most likely expect UTF-8 and not understand UTF-16LE by default, but there may be a switch to tell it the encoding of the input file, or a conversion tool you can pipe the file through to turn it into UTF-8 before your main script processes it.

I'm not familiar enough with that environment to know exactly what to do, but there should be a fairly easy solution. Getting the proper characters into the file (instead of the "?" characters you get by default with DOS commands like echo) is half the puzzle, which we've solved; getting the script to understand UTF-16LE is the other half.

Yes, I forgot to mention that the NAS runs Linux. I'm not familiar with Linux either, but it seems to be a "lite" version and not all the commands are available. However, I'm going to look for a switch for this problem, thank you :slight_smile:

My NAS script is OK
My Windows batch is OK
My two-way special-character conversion is OK

The last thing I need is the custom button...

Obviously, it's not simple to write {filepath$} & {destpath} to a file with a DOS command because of the special chars.

So what is the best way to write them with DO functions?

I can use my two-way special-character conversion batch (its arguments can be a string or a filename).

The command I gave will put {filepath$} and {destpath} into the output file, including any special characters.

For example, if you drag the output file from my command and drop it on Notepad, which understands UTF-16LE, you'll see the filenames with their special chars intact.

Yes, your command works fine, because I see the special characters in Notepad or Notepad++, but when I print these characters in a DOS CLI or in the Linux terminal, I don't see them :confused:

I found the problem! It's the batch file itself which is encoded in OEM 850 instead of ANSI!

ARGH...

I don't think the DOS CLI uses a Unicode-compatible font by default, and I'm not sure how the type command copes with UTF-16LE (it might need a special codepage set, or maybe it just doesn't support it).

The important thing is that the characters are in the text file, and have not been lost or converted into the wrong characters. You just need to convert that text file into a format which your Linux tools can understand, or find a way to make your Linux tools understand UTF-16LE themselves (they might have a switch or a mode for opening the text file).

For example, there is a tool called iconv which you can run on either Linux or Windows and which will take a file and change its encoding from UTF-16LE to UTF-8 (or various other combinations, depending on the arguments you give it): en.wikipedia.org/wiki/Iconv
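A minimal sketch of that conversion; the file paths here are placeholders, and the demo fabricates a tiny UTF-16LE input so it is self-contained:

```shell
#!/bin/sh
# Fabricate a small UTF-16LE (no BOM) file -- the bytes for "hi" plus
# a newline, each ASCII character followed by a NUL byte -- then
# convert it to UTF-8 for the NAS script to read.
printf 'h\000i\000\n\000' > /tmp/monitored.u16
iconv -f UTF-16LE -t UTF-8 /tmp/monitored.u16 > /tmp/monitored.utf8
cat /tmp/monitored.utf8   # prints "hi"
```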

A lot of scripting languages can do it as well, but I don't know which language you are using, so I can't suggest anything specific there.

What batch file? There isn't any batch file at all involved in the command I gave. (Do you mean the batch file or script on the Linux side?)

OK, thank you again. I'll see if I can use iconv (I'd prefer a solution without additional programs :confused: but if iconv works well and easily, why not).

If I can do the whole process on the Linux side, it will be perfect :slight_smile: (I hate the Windows batch language)

Yes, I was talking about my Windows batch, which converts every special character into a string of characters (and the Linux script reverses the process).

Unfortunately, iconv/locale support is highly experimental in uClibc (the C library optimized for embedded systems like my NAS) and more often than not broken, so it won't work on my NAS :frowning:

I just want to convert a text! :cry:
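One crude fallback when iconv is unavailable: for pure-ASCII text (which the |e1|-style mapping above guarantees), UTF-16LE is simply each ASCII byte followed by a NUL, so deleting the NUL bytes recovers the text. A sketch with a fabricated input file and placeholder paths; note that this mangles any genuine non-ASCII characters, so it only works combined with the mapping scheme:

```shell
#!/bin/sh
# ASCII text in UTF-16LE is <char> <NUL> <char> <NUL> ...; stripping
# the NULs with tr yields plain ASCII. Paths are placeholders.
printf 'h\000i\000\n\000' > /tmp/monitored.u16   # fake UTF-16LE input
tr -d '\000' < /tmp/monitored.u16 > /tmp/monitored.ascii
cat /tmp/monitored.ascii   # prints "hi"
```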

If you are doing extra steps on the Windows side already, then instead of that conversion step I would run something on the Windows side to convert the text file into UTF-8. (E.g. the Windows ports of iconv, though I bet there are similar tools that don't require Cygwin or GnuWin32, too.) The Linux side should then be able to read the UTF-8 version of the file without needing any further conversion.

Yep, but if I do this I'll have to install a tool on each computer, and I would like a simpler way. I don't think I can convert to UTF-8 with a batch file alone, but I'm still searching anyway.