• expr@programming.dev

    I just found out about this debate and it’s patently absurd. The ISO 80000-2 standard defines ℕ as including 0, and that convention is foundational in basically all of mathematics and computer science. Excluding 0 is a fringe position and shouldn’t be taken seriously.

    • RandomWalker@lemmy.world

      I could be completely wrong, but I doubt any of my (US) professors would reference an ISO definition, and they may not even know it exists. Mathematicians in my experience are far less concerned about the terminology or symbols used to describe something, as long as it’s clearly defined. In fact, they’ll probably make up their own symbology just because it’s slightly more convenient for their proof.

      • doctordevice@lemmy.ca

        My experience (bachelor’s in math and physics, but I went into physics) is that if you want to be clear about including zero or not, you add a subscript or superscript to specify: for the non-negative integers you add a subscript zero (ℕ_0), and for the strictly positive naturals you write either ℕ_1 or ℕ^+.
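
        Spelled out as sets, that convention reads:

        ```latex
        \mathbb{N}_0 = \{0, 1, 2, 3, \dots\}, \qquad
        \mathbb{N}_1 = \mathbb{N}^{+} = \{1, 2, 3, \dots\}
        ```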

      • 𝓔𝓶𝓶𝓲𝓮@lemm.ee

        I hate those guys. I had one prof at uni who reinvented every possible symbol, and everything was so different that it was a PITA to learn from external material.

      • Leate_Wonceslace@lemmy.dbzer0.com

        they’ll probably make up their own symbology just because it’s slightly more convenient for their proof

        I feel so thoroughly called out RN. 😂

      • gens@programming.dev

        From what I understand, you can pay ISO to standardise anything. So it’s only useful for interoperability.

          • gens@programming.dev

            I feel they have an image to maintain, but I also feel they would sell out for enough money. So… tell me if you make it.

        • expr@programming.dev

          Yeah, interoperability. Like every software implementation of the natural numbers that includes 0.
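
          For instance, the usual Peano-style encoding you’d find in a typed functional language (a minimal sketch, not any particular library’s definition) has to start at zero, because zero is the base case of the whole construction:

          ```haskell
          -- Peano naturals: Zero is the base case;
          -- every other natural is a successor.
          data Nat = Zero | Succ Nat

          -- Addition recurses on its first argument;
          -- Zero acts as the additive identity.
          add :: Nat -> Nat -> Nat
          add Zero     n = n
          add (Succ m) n = Succ (add m n)
          ```

          Without Zero there would be no base case for the recursion.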

          • WldFyre@lemm.ee

            How programmers utilize something doesn’t make it the mathematical standard; idk why ISO would be a reference for this at all

    • Kogasa@programming.dev

      Ehh, among American academic mathematicians, including 0 is the fringe position. It’s not a “debate,” it’s just a different convention. There are numerous ISO standards which would be highly unusual in American academia.

      FWIW I was taught that the inclusion of 0 is a French tradition.

      • Leate_Wonceslace@lemmy.dbzer0.com

        I’m an American mathematician, and I’ve never experienced a situation where 0 being an element of the Naturals was called out. It’s less ubiquitous than I’d like it to be, but at worst the two are considered equally viable conventions of notation, or the question is left undecided.

        I’ve always used N to indicate the naturals including 0, and that’s what was taught to me in my foundations class.

      • xkforce@lemmy.world

        The US is one of three countries on the planet that still stubbornly use imperial units as their primary system. “The US doesn’t do it that way” isn’t a great argument against adopting a standard.

      • holomorphic@lemmy.world

        I have yet to meet a single logician, American or otherwise, who would use the definition without 0.

        That said, it seems to depend on the field. I think I’ve had this discussion with a friend working in analysis.

      • pooberbee (they/she)@lemmy.ml

        This isn’t strictly true. I went to school for math in America, and I don’t think I’ve ever encountered a zero-exclusive definition of the natural numbers.

  • dogsoahC@lemm.ee

    Well, you can naturally have zero of something. In fact, you have zero of most things right now.

  • affiliate@lemmy.world

    The standard (set-theoretic) construction of the natural numbers starts with 0 (the empty set) and then builds up the other numbers from there. So to me it seems “natural” to include it in the set of natural numbers.
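
    For reference, the von Neumann construction builds each natural as the set of all smaller naturals, so the empty set necessarily comes first:

    ```latex
    0 = \varnothing, \quad 1 = \{0\}, \quad 2 = \{0, 1\}, \quad \dots, \quad n + 1 = n \cup \{n\}
    ```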

    • Leate_Wonceslace@lemmy.dbzer0.com

      On top of that, I don’t think it’s particularly useful to have two different easy shorthands for the positive integers when it means that referring to the union of the positive integers and the singleton {0} becomes cumbersome.
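
      Concretely (a restatement, with ℤ_{≥0} as one common workaround):

      ```latex
      0 \in \mathbb{N}:\quad \mathbb{N} = \{0, 1, 2, \dots\},\ \mathbb{Z}^{+} = \{1, 2, \dots\} \\
      0 \notin \mathbb{N}:\quad \mathbb{N} = \mathbb{Z}^{+} = \{1, 2, \dots\},\ \mathbb{Z}^{+} \cup \{0\} = \mathbb{Z}_{\geq 0} = \{0, 1, 2, \dots\}
      ```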

  • baseless_discourse@mander.xyz

    I think if you ask any mathematician (or any academic who uses math professionally, for that matter), they’ll say 0 is a natural number.

    There is nothing natural about not having an additive identity in your semiring.
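
    For anyone who hasn’t met the term: a semiring’s addition is required to have an identity element, i.e. some element 0 with

    ```latex
    0 + a = a + 0 = a \qquad \text{for all } a,
    ```

    and {1, 2, 3, …} under ordinary addition contains no such element.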

  • l10lin@lemmy.world

    The definition of the natural numbers is the same as that of the non-negative integers, so of course 0 is a natural number.

    • blind3rdeye@lemm.ee

      In some countries, zero is neither positive nor negative. But in others, it is both positive and negative. So saying the set of natural numbers is the same as the non-negative integers doesn’t really help. (Also, obviously not everyone would agree with that definition regardless of whether zero is negative.)

      • gregorum@lemm.ee

        There’s a frog on the log on the hole on the bottom of math. There’s a frog on the log on the hole on the bottom of math. A frog. A frog. There’s a frog on the log on the hole on the bottom of math.

    • RandomWalker@lemmy.world

      Rigorously, yes. Unambiguously, no. Plenty of words (like continuity) can mean different things in different contexts. The important thing isn’t the word, it’s that the word has a clear definition within the context of a proof. Obviously you want to be able to communicate ideas clearly, and so a convention of symbols and terms has been established over time, but conventions can change over time too.

  • Codex@lemmy.world

    I’d learned somewhere along the line that the Natural numbers (that is, the set ℕ) are all the positive integers and zero, and that without zero these were the Whole numbers. I see on Wikipedia (as I was digging up that Unicode symbol) that this is contested now. Seems very silly.

  • AppleMango@lemmy.world

    I have been taught, and everyone around me accepts, that Natural numbers start from 1 and Whole numbers start from 0.

    • baseless_discourse@mander.xyz

      Oh no, are we calling non-negative integers “whole numbers” now? There are proposals to change bad naming in mathematics, but I hope this is not one of them.

      On the other hand, changing “integer” to “whole number” makes perfect sense.

    • lowleveldata@programming.dev

      As a programmer, I’d ask you to link your preferred definition of natural number along with your request, because I can’t be fucked to guess

    • ramble81@lemm.ee

      Did we need to invent it, or did it just take that long to discover it? I mean, “nothing” has always been around, and there’s a lot that already existed that we didn’t discover until much more recently.

      • darthelmet@lemmy.world

        Does “nothing” “exist” independently of what there is nothing of, or of the span of time and space in which there is nothing of it?

        There’s always been “something” somewhere. Well, at least as far back as we can see.

    • Kogasa@programming.dev

      There can’t really be an argument either way; it’s just a matter of convention. “Natural” is just a name, not meant to imply that 1 is somehow more fundamental than -1, so arguing about whether 0 is “natural” is beside the point.

  • TheObviousSolution@lemm.ee

    Zero grew up from the seeds of the undefined, just like negative numbers and people who refuse to accept that the square root only has one value. Undefined is a pathway to many abilities some would consider unnatural.

  • werefreeatlast@lemmy.world

    So 0 is hard. But you know what? Tell me what non-whole number comes right before or after 0. That’s right, we don’t even have a name for that number.