Forum › Forums › General › Tips and Tricks › Internationalisation of antiX community scripts
- This topic has 56 replies, 7 voices, and was last updated Mar 8-3:48 pm by Anonymous.
January 20, 2021 at 2:26 pm #50198Member
Robin
I decided to extract the internationalisation section from the most recent version of the script tell-device.sh (just a working title) and make it easily available and understandable to everybody dealing with scripting here. I had written this internationalisation originally for the aforesaid script only, but I believe there can be some more benefit to the antiX community when this script section is extracted from its original context.
What does this tool do, and how can it make translation of community scripts more convenient?
When translating a community script, a volunteer translator normally has to search the complete script for strings which get displayed at execution time and distinguish them from very similar looking internal strings that must not be touched for functionality reasons. This is a rather time-consuming task which entails the danger of rendering the script unusable in the process, with the consequence that the translator has to debug the translated script again. A simple quotation mark put in the wrong place might stop the show.
This internationalisation structure is designed to make modifying the script itself unnecessary by concentrating all the text strings used in the script in one place, and moreover to provide a way of outsourcing the actual translation task to separate files, two for each language.
That way, translating the script to a new language only means copying an existing language or help file as a template for the new language, observing a special naming convention, and translating the strings inside this file only. These files don't contain any script code but only text strings, and hence are laid out more clearly than the underlying script.
The structure provides some additional information and testing functionality, so the translator may check whether the newly translated language and help files will meet the needs. He can be absolutely sure about not having accidentally altered script code during the translation process. Due to the naming convention utilised, the script will recognise any new pair of files and use them automatically when they match the language a user of the script has set his system to.
As long as no dedicated language and help files are present for his language – either country specific or not – the script will fall back to the built-in string table and internal help document the creator of the script originally wrote. So the script will be able to run even without any external help and language files at all, using universal English (en).
In case of difficulties running the script with a freshly translated pair of files, it is enough to move them out of reach of the script (e.g. to another directory) until the errors within them are fixed, which can now simply be done by carefully comparing the files in question with those of other languages. This is much easier than fishing for a malfunctioning translated text string among a multitude of code lines inside a script.
So I hope we will see more scripts from the antiX community translated to as many languages as possible – for the benefit of other users in all countries of the world.
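The fallback behaviour described above can be sketched in a few lines of bash. Note that the file naming convention (lang_XX.txt) and the variable name below are invented for illustration only; they are not taken from the actual antiX-multilanguage-support.sh script:

```shell
#!/bin/bash
# Hypothetical sketch of the described fallback: source an external
# translated string table if one exists for the user's language,
# otherwise fall back to the built-in English strings.
cd "$(mktemp -d)"                        # empty dir: no translation files yet
LANG_CODE="${LANG%%[_.]*}"               # e.g. "de" from "de_DE.UTF-8"
if [ -f "./lang_${LANG_CODE}.txt" ]; then
    . "./lang_${LANG_CODE}.txt"          # external translated string table
else
    TXT_GREETING="Hello World"           # built-in English fallback
fi
echo "$TXT_GREETING"
```

Run in an empty directory, this always prints the built-in English string; dropping a matching lang_XX.txt file next to the script would override it.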
You may want to start this multilanguage script from a terminal window, using the option -h or --help first in order to get some more detailed information on how this is intended to be used. Command line: "/bin/bash antiX-multilanguage-support.sh -h"
A tiny example script ("Hello World") is embedded already, so you may start it in order to evaluate the effect and study the correct way of embedding your own script. Command line: "/bin/bash antiX-multilanguage-support.sh". I was able to provide language and help files translated into French, Russian and German. For users of these languages and of English, the example script will start in their language already. Users of any other language still need to get their specific pair of files translated; until then they'll get the built-in universal English.
This multilanguage infrastructure solution is published under the conditions of the GPL.
Native speakers of English, French and Russian are welcome to improve my initial translations, which you'll find in the files enclosed; since I am not a native speaker, you might discover some rare phrases and unnatural sentence order used within them. All users able to provide a translation to additional languages are welcome to create and upload the suitable pair of files for their languages to this thread.
In case you are not able to create these files on your own following the instructions given by the --help option, you might also simply post the translated strings and help message as plain text in an answer to this thread.
In any case you will be honoured by a mention (nickname or clear name, at your choice) in a comment line within this antiX community multilanguage infrastructure script. Please let us know in case you don't like to be mentioned.
Have fun with it.
Robin.

- This topic was modified 2 years, 3 months ago by Robin.
Attachments:
Windows is like a submarine. Open a window and serious problems will start.
January 20, 2021 at 5:08 pm #50201Member
marcelocripe
::Hi Robin,
I would like to congratulate you on your initiative in proposing a method that standardizes the way the scripts are created and also allows translations to take place in separate files, preventing the original script from being compromised.
This is much better than I could imagine …
When we started, PPC, you and I did the translations of the scripts individually, file by file, for the programs that are not translated in antiX 19.X.
Then you implemented this technique in "unplugdrive.sh" / "unplugdrive-pt-br.sh", making it internationalizable to several languages in the same file "unplugdrive-intl.sh" – I found this solution fantastic! Now, with this new proposal of yours, I can see that the isolated actions can become collective actions that will make antiX international and still allow the expansion of translations into several languages on an ongoing basis.
I hope that antiX developers and programmers will welcome your proposal and look favorably upon it.
marcelocripe
(Original text in Brazilian Portuguese)———-
Olá Robin,
Eu gostaria de parabenizá-lo por sua iniciativa de propor um método que permita padronizar a forma de criação dos scripts e ainda de permitir que as traduções possam ocorrer em arquivos separados, impedindo de comprometer o script original.
Isso é muito melhor do que eu poderia imaginar …
Quando começamos, o PPC, você e eu a fazermos as traduções dos scripts de forma individual, arquivo por aquivo dos programas que não estão traduzidos no antiX 19.X.
Em seguida, você implementou esta técnica no “unplugdrive.sh”/”unplugdrive-pt-br.sh” tornando-o internacionalizável para vários idiomas em um mesmo arquivo “unplugdrive-intl.sh”, eu achei esta solução fantástica! Agora com esta sua nova proposta, eu consigo vislumbrar que as ações isoladas possam se tornar ações coletivas que tornarão o antiX internacional e ainda permitirá a expansão das traduções para vários idiomas de forma contínua.
Eu espero que os desenvolvedores e programadores do antiX acolham a sua proposta e as enxergue com bons olhos..
marcelocripe
(Texto original em Português do Brasil)

January 20, 2021 at 6:19 pm #50207Forum Admin
Dave
::Apologies for my ignorance; I only know one language well enough for regular utilization.
Are we meant to source this translation script which in turn sources the correct language file with the list of variables from within our scripts?
I assume this is only meant for bash scripts?
If so, does bash not have a built-in translation function via gettext, utilizing the variables:

TEXTDOMAIN=@PACKAGE@
export TEXTDOMAIN
TEXTDOMAINDIR=@LOCALEDIR@
export TEXTDOMAINDIR

If so, could we not write a file (or utilize the system default) which contains the language format in ~/ and source that file for use in the LANG variable before utilizing gettext?
I know how gettext works in python, not 100% certain in bash. How is translation handled via the live boot scripts?
Is there something incorrect when using gettext?

Computers are like air conditioners. They work fine until you start opening Windows. ~Author Unknown
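A minimal sketch of the mechanism Dave describes might look as follows. The domain name "myscript" is an example only, and @PACKAGE@/@LOCALEDIR@ are build-time placeholders that would be substituted at install time:

```shell
#!/bin/bash
# Sketch only: "myscript" is a placeholder domain with no installed catalog.
command -v gettext >/dev/null || exit 0   # needs gettext-base
TEXTDOMAIN=myscript
export TEXTDOMAIN
TEXTDOMAINDIR=/usr/share/locale
export TEXTDOMAINDIR
# With no myscript.mo installed for the current locale, gettext simply
# prints the untranslated msgid, so the script still works in English:
gettext "Unplug the drive"; echo
```

The trailing bare echo is needed because gettext does not append a line feed.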
January 20, 2021 at 6:35 pm #50209Moderator
Brian Masinick
::@Dave: I believe that the technique you describe may work for TEXT.
Where I am much less certain is how it will work for other mechanisms, especially LOCALE details, but hopefully we have these from upstream projects, especially Debian.
--
Brian Masinick

January 20, 2021 at 6:37 pm #50210Member
Xecure
::I downloaded and looked a bit at the code. It is a nice idea and I think you can definitely use it for your scripts. The problem I see with it is that it is a non-standard approach to translating scripts.
I am no expert, but I have been reading a few of the antiX scripts (ones that come with the system, some that are published here in the forum by the community). After some time, I tried to figure out how bash scripts are set up for translation in Linux (and antiX), and have also developed my own set of tools, which should result in a more standard way for translating.
What I see for most scripts and programs is the use of a .pot (portable object template) file that hosts all the strings that need translation. From this template file, a new set of .po (portable object) files is then generated, one for each language (ready for translation).
If the script changes and new output strings are added, you need to update the .pot template and then also update the .po files for translation, and let the translators know that new lines need translation. This is what I think is the standard procedure.
Following comments and recommendations of many forum members, I found that there is an easy way to generate .pot files from a bash script. It requires some preparation, but it greatly reduces the effort of later exporting the .pot file automatically. The trick is to use a $ sign in front of every output string. Example:
# Original lines of code
echo "Processing function X"
if [[ "$VARIABLE" == "error" ]]; then
    echo "Error: incorrect variable"
fi

# Lines of code with export preparations
echo $"Processing function X"
if [[ "$VARIABLE" == "error" ]]; then
    echo $"Error: incorrect variable"
fi

Once preparations are finished, you can export all output strings that you want translated to a .pot file using the command:
bash --dump-po-strings name-of-script | xgettext -L PO -o locale/name-of-script.pot -
This will extract all strings that we previously prepared from the bash script “name-of-script” and save it in a .pot file located inside a “locale” folder I created specifically for storing all translation files.
There is probably an extra option needed to convert the charset to UTF-8, but as I don't know which, I use this command to fix the charset of the file:
sed -i '0,/charset=CHARSET/s//charset=UTF-8/' locale/name-of-script.pot
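The sed substitution can be tried safely on a dummy header line before running it on a real .pot file:

```shell
# Sandbox check of the charset fix: a freshly generated .pot header carries
# a literal charset=CHARSET placeholder, which the sed call rewrites once.
tmp="$(mktemp)"
printf '"Content-Type: text/plain; charset=CHARSET\\n"\n' > "$tmp"
sed -i '0,/charset=CHARSET/s//charset=UTF-8/' "$tmp"
grep -c 'charset=UTF-8' "$tmp"    # prints 1
```

The 0,/…/ address makes sed replace only the first occurrence, so translated strings that happen to mention CHARSET are left alone.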
or you can manually edit the file so that charset=UTF-8.
This is the automatic way I found to export all strings that need to be translated to a .pot template file. After this, you can export a .po file for every language you want your script to be translated to with:
msginit -l language-code -i locale/name-of-script.pot -o locale/language-code/name-of-script.po
(this requires a folder “language-code” already created for said language). An example for the German language:
msginit -l de -i locale/name-of-script.pot -o locale/de/name-of-script.po

Every time you update your script, run the .pot export, and then run each of the .po exports per language. If a file is already present (and already contains translated text) for a specific language, you would use this command to merge the new lines with the ones present in your .po file:
msgmerge -U locale/language-code/name-of-script.po locale/name-of-script.pot

With this, the .po file will be updated (so all the already translated work doesn't go to waste). You will then have to check any entries marked "#, fuzzy", as these will be ignored because of some problem found during the merge (for example, if you changed one word or the order of words in your original script, the old translated string will not match the new string properly).
Once you are happy with the translations, you can convert each .po to a .mo (Machine Object) file, which is the binary that will be used and imported when the script runs.
msgfmt locale/language-code/name-of-script.po -o locale/language-code/name-of-script.mo

But how will the script know where the translation files are? You will need to set a text domain inside your script. For the example above, we will install the catalogs as /usr/share/locale/language-code/LC_MESSAGES/name-of-script.mo. So, at the beginning of the script, you need to set these variables:
TEXTDOMAINDIR=/usr/share/locale
TEXTDOMAIN=name-of-script

With this, when the script runs, it will check our locale (language), check if there is a name-of-script.mo file in /usr/share/locale/our-language-code/LC_MESSAGES/ and, if none is present, will default to the default language (generally English) of the script.
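That fallback can be observed directly by pointing TEXTDOMAINDIR at an empty directory, which simulates "no .mo file installed for this locale" (assuming the gettext runtime is installed; the domain name is the same example as above):

```shell
# Sketch: with an empty locale tree there is no catalog to match, so
# gettext falls back to returning the original (English) msgid.
command -v gettext >/dev/null || exit 0          # needs gettext-base
export TEXTDOMAINDIR="$(mktemp -d)"              # empty tree: no *.mo files
export TEXTDOMAIN=name-of-script
gettext "Processing function X"; echo            # prints the msgid unchanged
```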
You can see how I have set this up for the antix-wifi-switch here: https://gitlab.com/nXecure/antix-wifi-switch
The antiX Linux transifex (https://www.transifex.com/anticapitalista/antix-development/) has 53 languages available for translators to collaborate with (maybe this has changed since last year, but when I created antix-wifi-switch, this was what I could see).
am,am - Amharic
ar,ar - Arabic
be,be - Belarusian
bg,bg - Bulgarian
bn,bn - Bengali
ca,ca - Catalan
cs,cs - Czech
da,da - Danish
de,de - German
el,el - Greek
en,en - English
es,es - Spanish
et,et - Estonian
eu,eu - Basque
fa,fa - Persian
fi,fi - Finnish
fil_PH,tl - Filipino (Philippines) (Tagalo)
fr,fr - French
fr_BE,fr - French (Belgium)
gl_ES,gl - Galician (Spain)
hi,hi - Hindi
he_IL,he - Hebrew (Israel)
hr,hr - Croatian
hu,hu - Hungarian
id,id - Indonesian
is,is - Icelandic
it,it - Italian
ja,ja - Japanese
kk,kk - Kazakh
ko,ko - Korean
lt,lt - Lithuanian
mk,mk - Macedonian
mr,mr - Marathi
nb,no - Norwegian Bokmål
nb_NO,no - Norwegian Bokmål (Norway)
nl,nl - Dutch
nl_BE,nl - Dutch (Belgium)
pl,pl - Polish
pt,pt - Portuguese
pt_BR,pt - Portuguese (Brazil)
ro,ro - Romanian
ru,ru - Russian
sk,sk - Slovak
sl,sl - Slovenian
sq,sq - Albanian
sr,sr - Serbian
sv,sv - Swedish
th,th - Thai
tr,tr - Turkish
uk,uk - Ukrainian
vi,vi - Vietnamese
zh_CN,zh-CN - Chinese (China)
zh_HK,zh-TW - Chinese (Hong Kong)
zh_TW,zh-TW - Chinese (Taiwan)

I saved this file in $HOME/Documents/language.codes and use it as a reference for the script I use to auto-generate and update translation files. Here is my translation-update script, currently set up for antix-wifi-switch:
#!/bin/bash
## This script extracts $"" lines from a file and creates
## ready-to-be-translated files or updates the ones already created.
PATH_OF_SCRIPT="$HOME/gitz/antix-wifi-switch/antix-wifi-switch" # The only line you need to change
SCRIPT_DIRECTORY="${PATH_OF_SCRIPT%/*}"
TEXTDOMAIN="$(cat "$PATH_OF_SCRIPT" | grep -m1 "TEXTDOMAIN=" | cut -d"=" -f2)"
PATH_TO_LOCALE="${SCRIPT_DIRECTORY}/locale"
PATH_TO_POT_FILE="${PATH_TO_LOCALE}/${TEXTDOMAIN}.pot"
PATH_TO_LOCALE_LIST="$HOME/Documents/language.codes"

# Create locale folder if it doesn't exist
if [ ! -d "${PATH_TO_LOCALE}/" ]; then
    mkdir "${PATH_TO_LOCALE}/"
fi

## Extract strings
bash --dump-po-strings "$PATH_OF_SCRIPT" | xgettext -L PO -o "$PATH_TO_POT_FILE" -

## Replace encoding with UTF-8
sed -i '0,/charset=CHARSET/s//charset=UTF-8/' "$PATH_TO_POT_FILE"

## Add or update translations
while read -r line; do
    MY_LOCALE="$(echo $line | cut -d"," -f1)"
    # Make language locale folder
    if [ ! -d "${PATH_TO_LOCALE}/${MY_LOCALE}/" ]; then
        mkdir "${PATH_TO_LOCALE}/${MY_LOCALE}/"
    fi
    # Make LC_MESSAGES under locale folder
    if [ ! -d "${PATH_TO_LOCALE}/${MY_LOCALE}/LC_MESSAGES/" ]; then
        mkdir "${PATH_TO_LOCALE}/${MY_LOCALE}/LC_MESSAGES/"
    fi
    MY_LOCALE_PO_FILE="${PATH_TO_LOCALE}/${MY_LOCALE}/LC_MESSAGES/${TEXTDOMAIN}.po"
    # If .po file doesn't exist, create it
    if [ ! -f "$MY_LOCALE_PO_FILE" ]; then
        msginit -l "$MY_LOCALE" -i "$PATH_TO_POT_FILE" -o "$MY_LOCALE_PO_FILE" --no-translator
    # As the file exists, update it
    else
        msgmerge -U "$MY_LOCALE_PO_FILE" "$PATH_TO_POT_FILE"
    fi
done <"$PATH_TO_LOCALE_LIST"

I hope this helps you understand how I tried to do things so they (try to) follow the "standards" expected by most translation projects. I may have missed a step, as I set this up a few months ago, so I hope anyone reading this can figure out what is missing.
antiX Live system enthusiast.
General Live Boot Parameters for antiX.

January 20, 2021 at 7:04 pm #50215Moderator
Brian Masinick
::@Xecure: Your technique is sound. There is more than one technique to localize, but technologies have improved greatly over the past 20-25 years. Back when I was on an I18N/L10N team in the 90's there were certainly "conventions" but no fixed standards, and definitely no automated way of performing end-to-end localization, though my team did create several scripts to assist in the process, and I wrote (or modified) several of them after one very smart engineer figured out how to do several things.
For text, Dave’s method may work, but your technique appears to be equally sound. If there ARE established STANDARDS or conventions now available, it would be very beneficial to 1) Find them and 2) Utilize them as much as possible; otherwise use a blend of what you and Dave have mentioned.
--
Brian Masinick

January 20, 2021 at 7:12 pm #50218Member
Xecure
::otherwise use a blend of what you and Dave have mentioned.
In fact, my method uses Dave's method (TEXTDOMAIN). I didn't see Dave's answer until I published my reply, but it does in fact use what Dave mentioned. Maybe I went into too much detail about how I do it, so if this reply is out of place, please delete it and leave Dave's reply.
antiX Live system enthusiast.
General Live Boot Parameters for antiX.

January 20, 2021 at 7:17 pm #50219Moderator
Brian Masinick
::No, I think that both of you have useful information.
I would like to see if any common techniques have become prevalent since the UNIX systems group I was in did similar work long ago; I HOPE SO!
--
Brian Masinick

January 20, 2021 at 7:22 pm #50221Anonymous
::Compared to previous gracious replies from others, I do realize that this post may seem unkind or harsh.
When translating a community script, a volunteer translator normally has to search the complete script for strings which get displayed at execution time and distinguish them from very similar looking internal strings that must not be touched for functionality reasons.
isn’t this a false premise?
Both files have to reside in the same folder the script is stored in.
No.
Such a stipulation would be untenable, unworkable, impractical.
Consider: 100+ scripts residing within a given $PATH directory
multiplied by
53 (transifex) or 400+? (ref: /etc/locale.gen.all) localization files for each

The FHS Filesystem_Hierarchy_Standard has been honed across decades, and is borne of necessity.
(excerpted from the instructions provided in the OP’s proposed script)
Please start with your script code from line 230
Again, this
( an “insert your script here, into this template” approach, vs a “source(ing)” approach )
this stipulation would be untenable, unworkable, impractical.

January 20, 2021 at 8:15 pm #50226Anonymous
::It makes sense to use "talking" names (probably English) instead of TXT_STRING_01, TXT_02, TXT_03.
consider this example:
https://github.com/sparkylinux/wm-logout/blob/master/bin/wm-logout
The task of attempting to debug or maintain an overly-abstracted script like this is an unwelcome chore. Floop1 means blergle… and blergle (proprietary to the lexxxxxicon of this single script) expands to $something…

## This script extracts $"" lines from a file and creates

what [step] is missing

_("thestring")

gettext uses an underscore as the "function name" translated strings are passed to, right?
(AFAIK, this statement is correct. I'm unaware of any exceptions or variants that might exist.)

Regardless of whether or not gettext is employed as the string translation function, the use of an underscore seems superior to the use of a $
January 20, 2021 at 8:21 pm #50229Member
Xecure
::_(“thestring”)
gettext uses an underscore as the “function name” translated strings are passed to, right?
(AFAIK, this statement is correct. I'm unaware of any exceptions or variants that might exist.)

Regardless of whether or not gettext is employed as the string translation function, the use of an underscore seems superior to the use of a $
You are right. It is probably better for later searching. I will have to study gettext. As I saw a pattern in many antiX scripts, including the antixcc.sh script, I thought this was the way. I will start studying gettext and how to properly use it.
Thanks for the guidance, skidoo.
antiX Live system enthusiast.
General Live Boot Parameters for antiX.

January 20, 2021 at 10:22 pm #50232Member
Xecure
::Some research.
From https://www.gnu.org/software/bash/manual/html_node/Invoking-Bash.html
--dump-po-strings
A list of all double-quoted strings preceded by '$' is printed on the standard output in the GNU gettext PO (portable object) file format. Equivalent to -D except for the output format.
A standard bash feature (probably only useful for bash scripts). This was the one I was using, but it seems that the $"" syntax is deprecated (???)
From https://tldp.org/LDP/abs/html/localization.html
Starting with gettext-0.12.2, xgettext -o - localized.sh is recommended instead of bash --dump-po-strings localized.sh, because xgettext . . .
1. understands the gettext and eval_gettext commands (whereas bash --dump-po-strings understands only its deprecated $"…" syntax)
2. can extract comments placed by the programmer, intended to be read by the translator.
This shell code is then not specific to Bash any more; it works the same way with Bash 1.x and other /bin/sh implementations.

So, the way to use gettext (not as skidoo mentions – I am not able to find the _("string") method mentioned) would be to put gettext in front of every string (not as practical, as it would make yad code more complex, needing to define each string as a variable before using it in yad).
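As a hedged sketch of the xgettext route recommended above, extraction from a throwaway script might look like this (file names are invented; the gettext tools package must be installed, otherwise the sketch just skips):

```shell
# Sketch: extract gettext-wrapped strings with xgettext instead of
# bash --dump-po-strings; a preceding TRANSLATORS comment is carried over.
command -v xgettext >/dev/null 2>&1 || exit 0   # skip if gettext tools absent
demo="$(mktemp -d)"
cat > "$demo/localized.sh" <<'EOF'
#!/bin/sh
. gettext.sh
# TRANSLATORS: shown at the main prompt
echo "$(gettext 'Press any key to continue')"; echo
EOF
xgettext -L Shell --add-comments=TRANSLATORS -o "$demo/localized.pot" "$demo/localized.sh"
grep -c 'Press any key to continue' "$demo/localized.pot"
```

The --add-comments=TRANSLATORS option is what implements point 2 of the quoted recommendation: comments intended for the translator end up in the .pot file.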
From http://eyesfreelinux.ninja/posts/internationalising-shell-scripts-with-gettext.html
In the top of the script we put these lines:

export TEXTDOMAIN=myscript
export TEXTDOMAINDIR=/usr/share/locale
. gettext.sh

In the above example myscript is an example script name.
[…]
The script author changes the typical echo of this syntax:

echo 'Press any key to continue'

With:
echo $(gettext "Press any key to continue") ; echo

We place an extra echo with no arguments at the end because the gettext function does not add a line-feed.
[…]
If an echoed string contains any shell variables they must be replaced.
For example:

echo "Program name: $0"

Becomes:
PROGNAME=$0
echo $(eval_gettext "Prog name: \$PROGNAME") ; echo

The eval_gettext function is used instead of gettext and the dollar sign is escaped with a back-slash.
Again an extra echo is put in after the command to echo a translated string.
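The eval_gettext substitution can be tried directly; gettext.sh ships with the gettext runtime and defines eval_gettext. The domain below is an example with no installed catalog, so the msgid comes back with the variable already filled in:

```shell
#!/bin/bash
# Sketch of eval_gettext variable substitution; skips if gettext.sh absent.
command -v gettext.sh >/dev/null 2>&1 || exit 0
export TEXTDOMAIN=myscript            # example domain, no catalog installed
. gettext.sh
PROGNAME="demo"
echo "$(eval_gettext "Prog name: \$PROGNAME")"
```

Note that PROGNAME does not need to be exported: gettext.sh exports the variables named in the msgid into the subshell it uses for substitution.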
[…]
Now we run xgettext against our script to create a file of strings to translate.
We do this in this fashion:

xgettext -L Shell -o myscript.pot myscript

This reads all the echo commands which contain references to the gettext and eval_gettext functions and writes them to a .pot (portable object template) file.
So, for yad strings, it would transform from:
yad --title="Window title" --text="text inside your yad dialog" \
    --button="OK!gtk-apply!I understand this message":0 \
    --button="Cancel!gtk-cancel!I don't agree with the message":1

to
yad --title="$(eval_gettext "Window title")" --text="$(eval_gettext "text inside your yad dialog")" \
    --button="$(eval_gettext "OK!gtk-apply!I understand this message")":0 \
    --button="$(eval_gettext "Cancel!gtk-cancel!I don't agree with the message")":1

But there should also be a way to use a function for echoing with a very specific name that precedes all output strings (for example, a function named translate_text). The translate_text function could look like:
function translate_text(){
    local ARGS="$@"
    echo "$ARGS"
}

The string could look like:
echo $(translate_text "String of text that needs translating")
And use the command
xgettext -o name-of-script.pot --from-code=UTF-8 --keyword --keyword=translate_text *.sh
to export all strings that come after the "translate_text" word, by using it as a keyword. But I still don't understand enough, so I cannot be sure it will work as I expect.

antiX Live system enthusiast.
General Live Boot Parameters for antiX.

January 20, 2021 at 11:54 pm #50237Anonymous
::self-correcting my earlier overreaching misstatement:
c/c++ programs calling gettext generally use an underscore as shorthand for the “function name” translated strings are passed to
the bolded text is preserved from the original
https://www.gnu.org/software/gettext/manual/html_node/bash.html
gettext manual
15.5.13 bash – Bourne-Again Shell Script

GNU bash 2.0 or newer has a special shorthand for translating a string and substituting variable values in it: $"msgid". But the use of this construct is discouraged, due to the security holes it opens and due to its portability problems.
The security holes of $”…” come from the fact that after looking up the translation of the string, bash processes it like it processes any double-quoted string: dollar and backquote processing, like ‘eval’ does.
In a locale whose encoding is one of BIG5, BIG5-HKSCS, GBK, GB18030, SHIFT_JIS, JOHAB, some double-byte characters have a second byte whose value is 0x60. For example, the byte sequence \xe0\x60 is a single character in these locales. Many versions of bash (all versions up to bash-2.05, and newer versions on platforms without mbsrtowcs() function) don’t know about character boundaries and see a backquote character where there is only a particular Chinese character. Thus it can start executing part of the translation as a command list. This situation can occur even without the translator being aware of it: if the translator provides translations in the UTF-8 encoding, it is the gettext() function which will, during its conversion from the translator’s encoding to the user’s locale’s encoding, produce the dangerous \x60 bytes.
A translator could – voluntarily or inadvertently – use backquotes `...` or dollar-parentheses $(…) in her translations. The enclosed strings would be executed as command lists by the shell.

The portability problem is that bash must be built with internationalization support; this is normally not the case on systems that don't have the gettext() function in libc.
Hmm, a single underscore (as a variable name) isn’t possible in bash.
https://man7.org/linux/man-pages/man1/bash.1.html
Shell Variables
The following variables are set by the shell:
_   At shell startup, set to the pathname used to invoke the shell or shell script being executed as passed in the environment or argument list. Subsequently, expands to the last argument to the previous simple command executed in the foreground, after expansion. Also set to the full pathname used to invoke each command executed and placed in the environment exported to that command. When checking mail, this parameter holds the name of the mail file currently being checked.

grep -inr gettext /usr/local/lib
^—v
4 (of 8 total) gettext-related functions defined within antiX-common.sh

gt() {
    gettext "$1"
}

pfgt() {
    local fmt="$1" && shift
    printf "$(gettext "$fmt")" "$@"
}

[ "$Static_antiX_Libs" -o "$LOADED_STYLE" ] || \
    source $antiX_lib/antiX-style-default.sh

gt_ac() {
    gettext -d antiX-bash-libs "$@"
}

pfgt_ac() {
    local fmt="$1" && shift
    printf "$(gettext -d antiX-bash-libs "$fmt")" "$@"
}

…so, an improved solution would perform validation to at least guard against (by rejecting or removing/replacing or escaping) backtick characters, eh
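The gt() wrapper pattern above can be reproduced in miniature; with no compiled catalog for the current locale it passes the msgid straight through. The extraction command shown as a comment is an assumed pairing via xgettext's --keyword option, and the file name in it is an example:

```shell
#!/bin/bash
# Miniature of the antiX-common.sh gt() wrapper; needs gettext-base.
command -v gettext >/dev/null 2>&1 || exit 0
gt() { gettext "$1"; }
echo "$(gt 'Please plug in the drive')"
# Matching extraction for a script using gt() (file name is an example):
#   xgettext -L Shell --keyword=gt -o myscript.pot myscript.sh
```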
January 21, 2021 at 12:49 am #50238Member
Robin
::Hello all,
Compared to previous gracious replies from others, I do realize that this post may seem unkind or harsh.
No, skidoo, you are right. Exactly these points I was hoping to overcome in further versions (be aware, it is version 0.30 only). BUT, I don't think I'll go on along this path after what I learned from your replies.
All of you have set me on the right trail now. Thank you. My knowledge about this is some decades outdated, and actually I had never come across this gettext command in bash. The only thing I saw was people here struggling with translations of scripts string by string inside the script. Exactly that is what I wanted to spare community script users from.

I will have to study gettext. As I saw a pattern in many antiX scripts, including the antixcc.sh script, I thought this was the way. I will start studying gettext and how to properly use it.
Thanks for the guidance, skidoo.
So again, also from me: Thank you very much, skidoo. And thank you Xecure for your detailed explanations. What the manpage and help of gettext presented to me was so confusing that I considered it to have a completely different scope. "die Übersetzung zu SCHLÜSSEL aus TEXTBEREICH anfordern" ("request the translation for KEY from TEXTDOMAIN") doesn't make any sense even in German, and "Wenn der Parameter TEXTBEREICH nicht angegeben wurde, wird der Bereich durch die Umgebungsvariable TEXTDOMAIN bestimmt. Wenn der Katalog dieses Bereiches sich nicht im Standardverzeichnis des Systems befindet, kann durch die Umgebungsvariable TEXTDOMAINDIR ein anderes Verzeichnis angegeben werden." ("If the TEXTDOMAIN parameter is not given, the domain is determined by the TEXTDOMAIN environment variable. If the catalogue of this domain is not located in the system's standard directory, another directory can be specified via the TEXTDOMAINDIR environment variable.") is quite like gibberish. Nowhere is a .po file mentioned, or even what "textdomain" and "Textbereich" mean, or which "Katalog" is meant.
But since I know by now there IS an established way of providing this service, I'll study it, just as you, Xecure, already did. Hopefully it is suitable for community scripts also. My point is: yes, the way you directed me to is suitable for me. But even if I am not a programmer, don't consider me a standard user, since I am trained in reading (and writing) source code and manpages (many years ago I had to study Fortran F77 for some reason), as long as it's about procedural languages. (I don't have any clue about object-oriented languages.) I will carefully read what you have posted in detail, do some additional research, and will happily walk along this new path.

But do you really think a standard user will be able to create his own .po-type translation file for a community script? Where should he look for the strings and texts to be translated? And where should he put them after he has done the work? Does he even know about the existence of .po files? I didn't… Maybe this would be worth investigating, to provide a convenient solution for the antiX community (if there are not some tools out there providing exactly this functionality already): maybe something like a simplified local transifex for a single user, based on spacefm or yad… Only an idea.
Or am I failing to see the forest for the trees here? I count on your opinion.
Robin
- This reply was modified 2 years, 3 months ago by Robin.
- This reply was modified 2 years, 3 months ago by Robin. Reason: typo
- This reply was modified 2 years, 3 months ago by Robin.
Windows is like a submarine. Open a window and serious problems will start.
January 21, 2021 at 2:08 am #50252Anonymous
::When translating a community script, a volunteer translator normally has to search the complete script for strings which get displayed at execution time and distinguish them from very similar looking internal strings that must not be touched for functionality reasons.
Is this a false premise?
What you described (“has to search the complete script for strings”)
appears to be true, at least in some cases.
Until you called attention to this today, I had never noticed:

If one visits the "source code" repository https://gitlab.com/antiX-Linux/add-desktop-antix/-/tree/master/locale
or
downloads the "source package" http://repo.antixlinux.com/buster/pool/main/a/add-desktop-antix/add-desktop-antix_0.3.23.tar.xz

only the encoded *.mo files are provided!
Given the absence of plaintext intermediary *.po files, yes, any potential translator not aware of (and enrolled in using) transifex would need to “search the complete script for strings”. Additionally, according to someone’s recent post (marcelocripe?), a transifex set does not yet exist for every script.