• Ptsf@lemmy.world · 2 months ago

    Isn’t it all unicode at the end of the day, so it supports anything unicode supports? Or am I off base?

      • thevoidzero@lemmy.world · 2 months ago

        I thought most sane, modern languages use the Unicode block identification to determine whether a character can appear in a valid identifier. For example, none of the ‘numeric’ Unicode characters can appear at the beginning of an identifier, just as you can’t have ‘3var’.

        So once your programming language supports Unicode, it will automatically support any language whose script those blocks cover.
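        For what it’s worth, Python works this way: its identifier grammar is defined by Unicode character properties (XID_Start / XID_Continue) rather than by per-script special cases, which a quick check makes visible:

```python
# Python's identifier rules come from Unicode properties, not from
# per-language special cases, so any script's letters just work.
print("3var".isidentifier())        # False: digits can't start an identifier
print("var3".isidentifier())        # True
print("переменная".isidentifier())  # True: Cyrillic letters are letters too
```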

          • toastal@lemmy.ml · 2 months ago

            OCaml’s old m17n compiler plugin solved this by requiring you to pick one block per ‘word’ & only letting you switch to another block across an underscore. As such you can do print_แมว but you couldn’t do pℝint_c∀t. This is a totally reasonable solution.
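            A rough sketch of that rule in Python (a hypothetical helper, not the actual plugin; the stdlib doesn’t expose Unicode block data, so the first word of each character’s Unicode name stands in for its block here):

```python
import unicodedata

def m17n_ok(identifier: str) -> bool:
    """Approximate the m17n rule: every underscore-separated 'word'
    must draw all of its characters from a single block."""
    for word in identifier.split("_"):
        # crude block proxy: first word of the character's Unicode name
        blocks = {unicodedata.name(ch).split()[0] for ch in word}
        if len(blocks) > 1:
            return False
    return True

print(m17n_ok("print_แมว"))   # True: Latin word, then Thai word
print(m17n_ok("pℝint_c∀t"))   # False: mixed blocks inside one word
```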

          • lunarul@lemmy.world · 2 months ago

            I can’t imagine how something like a homograph attack could happen accidentally. If someone does this in code, they probably intended to troll the other contributors.

            • NeatNit@discuss.tchncs.de · 2 months ago

              Multilingual users have multiple keyboard layouts, usually switching with Alt+Shift or similar key combo. If you’re multitasking you might not realize you’re on the wrong keyboard layout. So say you’re chatting with someone in Russian, then you alt+tab to your source code and you spot a typo - you wrote my_var_xopy instead of my_var_copy. You delete the x and type in c. You forget this happened and you never realized the keyboard layout was wrong.

              That c that you typed is now actually с, Cyrillic Es.

              What do you say, is that realistic enough?
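              And the two characters really are distinct code points, which is easy to confirm:

```python
import unicodedata

latin = "c"          # U+0063 LATIN SMALL LETTER C
cyrillic = "\u0441"  # renders identically in most fonts
print(latin == cyrillic)                         # False
print(unicodedata.name(cyrillic))                # CYRILLIC SMALL LETTER ES
print("my_var_copy" == f"my_var_{cyrillic}opy")  # False: the homograph bug
```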

              • lunarul@lemmy.world · 2 months ago

                I use multilingual keyboard layouts, so I know that at least on Windows the selected layout is specific to each window. If I chat with someone in one language, then switch to my IDE, it will not keep the layout I used in the chat window.

                But I’ve also accidentally hit the combination to change layouts while doing something, so it can happen. I’m just surprised that Cyrillic с is on the same key as C, instead of S.

                • NeatNit@discuss.tchncs.de · 2 months ago

                  I believe there’s a setting for whether it’s global or per-window. Personally I prefer global, because I can’t keep track of more than one state and I absolutely hate the experience of typing something and getting a different language than I expected.

          • thevoidzero@lemmy.world · 2 months ago

            Sorry, I forgot about this. I meant to say that any sane modern language that allows Unicode should use the block specifications (e.g. to determine which code points are alphabetic, numeric, symbols, etc.) to apply the same kind of rules it applies to ASCII, so that it doesn’t have to support each language individually.

            • NeatNit@discuss.tchncs.de · 2 months ago

              Oh, that I agree with. But then there’s the mess of Unicode updates, and if you’re using an old version of the compiler that was built with an old version of Unicode, it might not recognize every character you use…
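              You can see this pinning in Python, for example: each interpreter build freezes one version of the Unicode database, and characters added in later Unicode releases won’t be recognized until the interpreter itself is updated.

```python
import unicodedata

# Each CPython release ships one frozen Unicode database version;
# identifiers using characters added in a newer Unicode release
# aren't recognized until the interpreter is rebuilt against it.
print(unicodedata.unidata_version)  # e.g. "15.0.0" on Python 3.12
```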

      • lad@programming.dev · 2 months ago

        Yes, but it’s still about the language, not the game engine.

        Although technically the statement is correct, since it is more specific.

            • ℍ𝕂-𝟞𝟝@sopuli.xyz · 2 months ago

              Godot is neat. There is C# support as well if you find that easier, but coming from Unreal, it’s night and day. I know Unreal has so many more features, but for a hobbyist like me, Godot is much better. It’s just this small executable, and you have everything you need to get creative.

    • Faresh@lemmy.ml · 2 months ago

      I think they exclude some Unicode characters from being used in identifiers. At least, last I tried, it wouldn’t allow me to use an emoji as a variable name.

        • Faresh@lemmy.ml · 2 months ago

          That code was C++ or something like that, not GDScript.

          I tested this on Godot 4.2.1. You can write identifiers using a writing system other than Latin, and you’re allowed to have emojis in strings, but you aren’t allowed to use emojis in identifiers.

          • histic@lemmy.dbzer0.com · 2 months ago

            Ah, I’m unfamiliar with most languages; I just use Python and random others for personal projects.

            • bleistift2@sopuli.xyz · 2 months ago

              Coding must be a nightmare if you’re choosing programming languages at random 😱

              But you must also be learning quite a lot.

              • histic@lemmy.dbzer0.com · 2 months ago

                I’m not choosing at random lol, that would be crazy. I mostly use Python and have been teaching myself Go and some Rust.

          • lunarul@lemmy.world · 2 months ago

            There’s probably a rule that requires variables to start with a letter or underscore. Emoji are not marked as letters. Something like _👍 will probably work.
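            It depends on the language, though; I can’t speak for GDScript, but in Python even the underscore trick fails, because emoji aren’t in any letter category:

```python
import unicodedata

print(unicodedata.category("👍"))  # 'So' (Symbol, other), not a letter
print("_thumbs".isidentifier())    # True: leading underscore is fine
print("_👍".isidentifier())        # False: emoji aren't XID_Continue
```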