I just found out about this debate and it’s patently absurd. The ISO 80000-2 standard defines ℕ as including 0, and that convention is foundational in basically all of mathematics and computer science. Excluding 0 is a fringe position and shouldn’t be taken seriously.
I could be completely wrong, but I doubt any of my (US) professors would reference an ISO definition, and may not even know it exists. Mathematicians in my experience are far less concerned about the terminology or symbols used to describe something as long as they’re clearly defined. In fact, they’ll probably make up their own symbology just because it’s slightly more convenient for their proof.
My experience (bachelor’s in math and physics, but I went into physics) is that if you want to be clear about including zero or not you add a subscript or superscript to specify. For non-negative integers you add a subscript zero (ℕ_0). For strictly positive natural numbers you can either do ℕ_1 or ℕ^+.
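To put the two conventions side by side (my own summary; usage still varies by author and country):

```latex
\mathbb{N}_0 = \{0, 1, 2, 3, \dots\}                % non-negative integers, zero included
\mathbb{N}_1 = \mathbb{N}^{+} = \{1, 2, 3, \dots\}  % strictly positive integers, zero excluded
```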
I hate those guys. I had one prof at uni who reinvented every possible symbol, and everything was so different. It was a PITA to learn from external material.
they’ll probably make up their own symbology just because it’s slightly more convenient for their proof
I feel so thoroughly called out RN. 😂
From what I understand, you can pay ISO to standardise anything. So it’s only useful for interoperability.
Can I pay them to make my dick length the ISO standard?
I feel they have an image to maintain, but I also feel they would sell out for enough money. So… tell me if you make it.
Yeah, interoperability. Like every software implementation of natural numbers that include 0.
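For instance, here’s a toy Peano-style encoding in Python (my own sketch, not any particular library’s API; all the names are made up) showing why software naturals are rooted at zero:

```python
from dataclasses import dataclass

# Toy Peano-style naturals: the whole inductive structure is rooted at zero.
class Nat:
    pass

@dataclass(frozen=True)
class Zero(Nat):
    pass

@dataclass(frozen=True)
class Succ(Nat):
    pred: Nat

def to_int(n: Nat) -> int:
    """Count the successors stacked on top of Zero."""
    return 0 if isinstance(n, Zero) else 1 + to_int(n.pred)

assert to_int(Zero()) == 0              # zero is the base case, not an afterthought
assert to_int(Succ(Succ(Zero()))) == 2  # succ(succ(zero)) = 2
```

Proof assistants like Lean and Coq define their built-in naturals the same way, with zero as the base constructor.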
How programmers use something doesn’t make it the mathematical standard; idk why ISO would be a reference for this at all.
Yeah, don’t do that.
Ehh, among American academic mathematicians, including 0 is the fringe position. It’s not a “debate,” it’s just a different convention. There are numerous ISO standards which would be highly unusual in American academia.
FWIW I was taught that the inclusion of 0 is a French tradition.
I’m an American mathematician, and I’ve never experienced a situation where 0 being an element of the Naturals was called out. It’s less ubiquitous than I’d like it to be, but at worst the two are treated as equally viable notational conventions, or the question is simply left undecided.
I’ve always used N to indicate the naturals including 0, and that’s what was taught to me in my foundations class.
Of course they’re considered equally viable conventions, it’s just that one is prevalent among Americans and the other isn’t.
I think you’re using a fringe definition of the word “fringe”.
I’m not.
The US is one of 3 countries on the planet that still stubbornly primarily uses imperial units. “The US doesn’t do it that way” isn’t a great argument for not adopting a standard.
I have yet to meet a single logician, American or otherwise, who would use the definition without 0.
That said, it seems to depend on the field. I think I’ve had this discussion with a friend working in analysis.
I did say mathematician, not logician.
Logicians are mathematicians. Well, most of them are.
But not all mathematicians are logicians.
Logically.
This isn’t strictly true. I went to school for math in America, and I don’t think I’ve ever encountered a zero-exclusive definition of the natural numbers.
It is true.
Well, you can naturally have zero of something. In fact, you have zero of most things right now.
How do you know so much about my life?
How do I have anything if I have nothing of something?
I have seen arguments for zero being a counting number because “not counting anything” is still an option in any arbitrary set of things you can intuitively count.
The standard (set-theoretic) construction of the natural numbers starts with 0 (the empty set) and then builds up the other numbers from there (a sketch follows below). So to me it seems “natural” to include it in the set of natural numbers.
On top of that, I don’t think it’s particularly useful to have 2 different easy shorthands for the positive integers, when it means that referring to the union of the positive integers and the singleton of 0 becomes cumbersome as a result.
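Here’s that construction as a rough executable sketch (Python, with frozensets standing in for sets; the helper name is mine):

```python
from itertools import islice

def von_neumann():
    """Yield 0 = {}, 1 = {0}, 2 = {0, 1}, ... as nested frozensets."""
    n = frozenset()                 # 0 is the empty set
    while True:
        yield n
        n = n | frozenset({n})      # successor: n + 1 = n ∪ {n}

zero, one, two = islice(von_neumann(), 3)
assert zero == frozenset()          # the construction begins at 0
assert one == frozenset({zero})     # 1 = {0}
assert len(two) == 2                # the ordinal n has exactly n elements
```

A nice side effect of this encoding is that each natural n is literally a set with n elements, which is hard to arrange if 0 isn’t allowed to exist.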
I think if you ask any mathematician (or any academic that uses math professionally, for that matter), 0 is a natural number.
There is nothing natural about not having an additive identity in your semiring.
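For anyone who hasn’t met the term, here is a rough sketch of the semiring axioms, which (ℕ, +, ·) satisfies only when 0 is included (without it, ℕ has no additive identity at all):

```latex
a + 0 = a                                       % 0 is the additive identity
a + b = b + a, \quad (a + b) + c = a + (b + c)  % addition: commutative monoid
a \cdot 1 = 1 \cdot a = a                       % 1 is the multiplicative identity
a \cdot (b + c) = a \cdot b + a \cdot c         % multiplication distributes over addition
a \cdot 0 = 0 \cdot a = 0                       % 0 annihilates under multiplication
```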
The definition of natural numbers is the same as that of the non-negative numbers, so of course 0 is a natural number.
In some countries, zero is neither positive nor negative. But in others, it is both positive and negative. So saying the set of natural numbers is the same as the non-negative [integers] doesn’t really help. (Also, obviously not everyone would even agree with that definition regardless of whether zero is negative.)
But -0 is also 0, so it can’t be a natural number.
In school I was taught that ℕ contained 0 and ℕ* was ℕ without 0.
I was taught ℕ did not contain 0 and that ℕ₀ is ℕ with 0.
ℕ₀* is ℕ with 0 without 0
Aren’t you guys taught about a thing called whole numbers??
Wait, I thought everything in math is rigorously and unambiguously defined?
There’s a hole at the bottom of math.
There’s a frog on the log on the hole on the bottom of math. There’s a frog on the log on the hole on the bottom of math. A frog. A frog. There’s a frog on the log on the hole on the bottom of math.
Rigorously, yes. Unambiguously, no. Plenty of words (like continuity) can mean different things in different contexts. The important thing isn’t the word, it’s that the word has a clear definition within the context of a proof. Obviously you want to be able to communicate ideas clearly, and so conventions of symbols and terms have been established over time, but conventions can change over time too.
Platonism vs. Intuitionism would like a word.
Yes, and like any science it gets revisited and contested periodically.
I’d learned somewhere along the line that Natural numbers (that is, the set ℕ) are all the positive integers and zero. Without zero, I was told, these were the Whole numbers. I see on Wikipedia (as I was digging up that Unicode symbol) that this is contested now. Seems very silly.
Weird, I learned the exact reverse. The recommended mnemonic was that the whole numbers included zero because zero has a hole in it.
N0
I have been taught and everyone around me accepts that Natural numbers start from 1 and Whole numbers start from 0
Oh no, are we calling non-negative integers “whole numbers” now? There are proposals to change bad naming in mathematics, but I hope this is not one of them.
On the other hand, changing integer to whole number makes perfect sense.
As a programmer, I’m ashamed to admit that the correct answer is no. If zero were natural, we wouldn’t have needed tens of thousands of years to invent it.
As a programmer, I’d ask you to link your preferred definition of natural number along with your request, because I can’t be fucked to guess.
I truly have no idea what you’re saying.
I think you’re considering whether zero is somehow “naturally-occurring”, while others may be considering the concept of a natural number, which is a nonnegative integer.
Did we need to invent it, or did it just take that long to discover it? I mean “nothing” has always been around and there’s a lot we didn’t discover till much more recently that already existed.
Does “nothing” “exist” independently of what there is nothing of, or of the span of time and space in which there is nothing of that thing?
There’s always been “something” somewhere. Well, at least as far back as we can see.
IMO we invented it, because numbers don’t real. But that’s a deeper philosophical question.
It is a natural number. Is there an argument for it not being so?
Well, I’m convinced. That was a surprisingly well-reasoned video.
Thanks for linking this video! It lays out all of the facts nicely, so you can come to your own decision.
There can’t really be an argument either way. It’s just a matter of convention. “Natural” is just a name; it’s not meant to imply that 1 is somehow more fundamental than -1, so arguing that 0 is “natural” is beside the point.
Zero grew up from the seeds of the undefined, just like negative numbers and people who refuse to accept that the square root only has one value. Undefined is a pathway to many abilities some would consider unnatural.
So 0 is hard. But you know what? Tell me what non-whole number comes right after or right before 0. That’s right, we don’t even have a name for that number.
±ε
I think the p-adics have that.
Zero is a number. Need I say more?
Bbbutt… Is it a “Natural” number?