
[css-values] Make top-level NaN be invalid at computed value time instead of ∞ #7067

Closed
Loirooriol opened this issue Feb 17, 2022 · 32 comments

Comments

@Loirooriol
Contributor

From https://drafts.csswg.org/css-values/#top-level-calculation

If a top-level calculation [...] would produce a value whose numeric part is NaN, it instead acts as though the numeric part is +∞.

This seems very random to me; why ∞ instead of, e.g., -∞, 0, 1, or 3.14159?
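
(For illustration, a minimal sketch of what the quoted rule implies; the property and values are arbitrary.)

/* 0 / 0 inside calc() is NaN; under the quoted rule the top-level NaN
   acts as +∞, which then gets clamped to the largest width the
   implementation supports, rather than making the declaration invalid. */
width: calc(0 / 0 * 1px);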

This topic was already touched in #4954. @LeaVerou said

I don't understand the sentence "it produces a NaN, which is censored into an infinity". Mathematically, 0/0 does not tend to infinity and is an indeterminate form.
It seems more reasonable that it would be invalid at computed value time, or something along these lines.

And during the CSSWG discussion:

  • oriol: One argument about this, maybe we could say that saying the initial value is not possible, but we could resolve it to NaN and keep NaN into the outer calc, then bubble it up, and say if the number resolves to NaN it behaves as the initial value.
  • TabAtkins: If a top-level calculation ends up being NaN, the property would be invalid at computed value time. It's possible.

But AFAIK the discussion didn't continue.

I do think that invalid at computed value time seems more reasonable than ∞.

@Loirooriol Loirooriol added the css-values-4 Current Work label Feb 17, 2022
@LeaVerou
Member

LeaVerou commented Feb 17, 2022

I agree. That also troubled us when we recently looked at this in the TAG.

Also, errors don't always happen during development, where they can be caught by the author and fixed. Often an error is a result of unexpected user input, and can happen at any time, so producing a value that is more likely to mask the error (thus producing a more predictable user experience) should be a goal, not something to avoid. We don't want to break people's websites on purpose to alert them that they made a mistake; this is not XML! And I highly doubt that using Infinity will make the author think "oh, I think I have a NaN!". It will likely just baffle them to no end, and make them conclude that they can't understand CSS.

@svgeesus
Contributor

This is why we introduced the none keyword and the powerless component concept in CSS Color 4 to represent things like indeterminate hue angle, because NaN had the wrong behavior.

NaN is still used in most implementations for doing calculations, but the none keyword is used when parsing or serializing CSS syntax; and the prose says to convert it to 0 (not +∞) during colorspace conversion:

For all other purposes, a missing component behaves as a zero value, in the appropriate unit for that component: 0, 0%, or 0deg. This includes rendering the color directly, converting it to another color space, performing computations on the color component values, etc.
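
(For illustration, a minimal sketch of the none behavior being quoted, assuming the CSS Color 4 modern hsl() syntax.)

/* 'none' marks the hue as missing, e.g. the indeterminate hue of a gray;
   for rendering, conversion, and computation it behaves as 0deg. */
color: hsl(none 0% 50%);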

@tabatkins
Member

tabatkins commented Feb 17, 2022

(For further history, the call that originally discussed this issue is minuted at https://lists.w3.org/Archives/Public/www-style/2014Apr/0373.html.)

This seems very random to me, why ∞ instead of e.g. -∞, 0, 1, or 3.14159?

Because the most likely cause of a NaN is division by 0, and the limit as you approach division by zero is either +Infinity or -Infinity, and +Infinity is more likely to be correct (CSS is usually doing math on positive values). (Edited: forgot that NaN only comes out of things like 0/0; plain ol' 1/0 is Infinity and totally normal.)

NaN can be produced via other mistakes, which can have a variety of limits, but since a NaN indicates a math error, preserving the author's precise intent wasn't considered too important; in these cases where the limit of the author's code is not Infinity, returning Infinity is the most likely value to produce a noticeably incorrect result.
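
(A sketch contrasting the two cases just described; the property is arbitrary.)

/* Plain division by zero has a well-defined limit and is just an infinity: */
width: calc(100px / 0);   /* +∞, clamps to an enormous width */
/* Only indeterminate forms produce NaN, where there is no limit to pick: */
width: calc(0px / 0);     /* NaN, censored to +∞ under the current draft */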

This is why we introduced the none keyword and the powerless component concept in CSS Color 4 to represent things like indeterminate hue angle, because NaN had the wrong behavior.

No, this is a false summary of that discussion. We introduced none because NaN was completely inappropriate for the use-case. Conflating a missing component and a component that has a division by zero or other invalid math operation is completely wrong. NaN is used for missing channels in some JS color libraries because its infectious behavior is slightly better/more recognizable if you forget to handle the case and naively do math on the channel (aka the "make it fuck up as badly as possible so authors notice the error" behavior, which Lea appears to be arguing is bad); using JS's null is more semantically appropriate (and more closely matches common JS practice in general), but less good in practice, since it coerces to 0 in many math operations and thus could hide the error. (source) CSS works differently in these regards, so the JS-based reasoning was irrelevant, and we could stick with common CSS design principles and use a keyword for the non-numeric value.

We don't want to break people's websites on purpose to alert them that they made a mistake, this is not XML!

We do this all the time, in many CSS features. For a relatively recent example that comes to mind, setting the grid positioning properties to a line name that doesn't exist will collapse them all to the first implicit line, which is almost certainly a bad position that will look wrong to the author. And as I mentioned in the preceding paragraph, breaking people's websites on purpose was the precise reason provided for JS color APIs to use NaN for missing channels.

(Heck, CSS's forward-compatible parsing, which drops the entire property when there's a mistake, is also an example of this.)

(Ah, I see I gave this same example last year.)

And I highly doubt that using Infinity will make the author think "oh, I think I have a NaN!". It will likely just baffle them to no end, and make them conclude that they can't understand CSS.

Do you think that using IACVT makes it more noticeable, understandable, or teachable? If so, can you elaborate on this reasoning?


FWIW I'm not necessarily opposed to changing the behavior here, but I am opposed to some of the arguments being deployed in service of this proposed change; they're wrong and don't reflect our general design philosophy elsewhere in CSS. There is a reason for the current behavior, and it's consistent with CSS's established design philosophy.

I don't recall if this was a specific reason for avoiding IACVT, but I'll note that you might not know the value is invalid at computed-value time; if a percentage is involved in producing a 0 division, it might not be apparent until used-value time when the % is finally resolved. So this would actually need to be a brand-new concept, and I'm not sure of the implications of that right now.
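
(A sketch of that timing problem, reusing the percentage-dependent division that comes up later in this thread.)

.box {
  /* 0px divided by 100% of the containing block: this is NaN only when
     that resolved width happens to be 0, which is a used-value-time fact. */
  width: calc(0px / 100% * 1px);
}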

@Loirooriol
Contributor Author

since a NaN indicates a math error, preserving the author's precise intent wasn't considered too important

The initial or inherited value probably isn't what they want either (otherwise they wouldn't be setting the property).

returning Infinity is the most likely value to produce a noticeably incorrect result.

Yes, but it may be going overboard. If we only tried to maximize incorrect results, we could also say that setting width to a calc() producing -1px also becomes the maximum value instead of being clamped to 0px.

you might not know the value is invalid at computed-value time

OK, good point about percentages. So rather than IACVT, I guess it would be that it "behaves as" the initial value or the inherited value (depending on whether the property is inherited).

@tabatkins
Member

Yes, but it may be going overboard. If we only tried to maximize incorrect results, we could also say that setting width to a calc() producing -1px also becomes the maximum value instead of being clamped to 0px.

Good thing we're not only trying to maximize incorrect results, then. ^_^

Due to our "no open ranges" policy, we can assume that any explicitly authored values outside of a valid range are an authoring mistake, and go for the maximally incorrect result of "drop the property as invalid". As long as all ranges are closed, authors can't accidentally write a value close to the boundary that might be across the boundary in some implementations.

This isn't the case for calculations, which, due to precision issues, can accidentally end up slightly outside of a valid range. Catastrophizing these cases would be author-hostile when we can tell what the likely intent was, so we just clamp instead. And there's no particularly good reason to set some "maximum error" beyond which we give up and do something maximally incorrect, so we just clamp all values.
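
(A sketch of the two policies just described, using width, whose valid range is non-negative.)

width: -1px;            /* literal out-of-range value: invalid at parse time, dropped */
width: calc(0px - 1px); /* calculations aren't range-checked at parse time;
                           the result is clamped to the allowed range, i.e. 0px */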

NaNs don't fall into this bucket - we have no means to predict what they're supposed to be (which is why they're NaN rather than an infinity or something). So nothing we do has even a slight assurance of matching author intent. So the only thing we can do is be maximally incorrect in some arbitrary way. Making the property invalid is out. I chose censoring to Infinity; reducing the whole value to initial or inherit is also possible.

CSS has existing precedent (already given) in cases like this to make the value deliberately, obviously bad. I don't think initial or inherit serves that purpose quite as well as an infinity. Custom properties need to use that behavior because they can be anything and be used anywhere; we have absolutely no idea what they are meant to substitute as, so we can't do anything smarter and obviously wrong. But for numbers we do have a more obviously wrong behavior available.

Like I said, I'm not against changing this in theory, but I do think the current behavior is fine and the suggested change would be slightly worse. I'm open to an argument that a different behavior would be better for authors, particularly in the recognizability/teachability aspect.

@LeaVerou
Member

Do you think that using IACVT makes it more noticeable, understandable, or teachable? If so, can you elaborate on this reasoning?

IACVT is a CSS-wide concept that can, in the future, be recognized by dev tools and displayed similarly to parse-time invalid values. Handling NaN as infinity means it will never appear as an error in dev tools.

(I will reply to the rest when I have a little more time)

@tabatkins
Member

IACVT is a CSS-wide concept that can, in the future, be recognized by dev tools and displayed similarly to parse-time invalid values. Handling NaN as infinity means it will never appear as an error in dev tools.

a) Remember, it won't be IACVT, but rather something similar that resolves at used-value time (after the invalid value was also passed down via inheritance, unlike IACVT which cleans up before inheritance)

b) I don't understand why you think DevTools can handle one but not the other. Both will turn into perfectly valid values at used-value time, with no author-observable remnants telling you that they originated from an invalid value. If DevTools can detect one had a NaN as its origin, it can detect the other.

@Loirooriol
Contributor Author

I have realized that making the property behave as unset is not forwards-compatible regarding shorthand expansion, unless we are willing to add some var()-like approach for calc().

Consider background-position: calc(0px / 100% * 1px) 5px, and assume that during layout the percentage resolves to 0px, namely background-position: calc(NaN * 1px) 5px. Then we treat the entire property as unset, so it behaves as 0% 0%, i.e. 0px 0px.

But at some point we may turn background-position into a shorthand of background-position-{x,y}. Then, at parse time the example would become background-position-x: calc(0px / 100% * 1px); background-position-y: 5px. And the behavior would end up being background-position-x: 0px; background-position-y: 5px.

@tabatkins
Member

tabatkins commented Mar 22, 2022

The TAG just rereviewed, with this comment:

@LeaVerou, @atanassov and myself reviewed this issue during our Hybrid TAG f2f. We still have concerns about the current proposal that are best for the CSS WG to review, thus, we suggest another round of discussions there. In particular, we are concerned that imposing runtime errors through +infinity will break layouts in unpredictable ways. For the purposes of layout, calc expressions can be evaluated during used-value time which in turn can depend on user context. This can result in unpredictable and difficult to evaluate/debug problems.
Further, given JS has handling of NaN different than +infinity, this proposal will introduce inconsistency in the overall web platform. We should try and avoid that.

I still believe there is no alternative that is better than the current behavior.

Addressing these bits of feedback directly:

  • "we are concerned that imposing runtime errors through +infinity will break layouts in unpredictable ways."

    Any and all solutions share this property. There is literally no way to divine authorial intent in this situation; that's why it resolves to NaN in the first place rather than a number or an infinity. In particular, the other solution presented here (making the property "invalid at used-value time") will be just as divergent from the author's intent as anything else. Thus, I don't believe this is a reasonable critique. There's no way to "predictably" break the layout in cases like this.

  • "For the purposes of layout, calc expressions can be evaluated during used-value time which in turn can depend on user context. This can result in unpredictable and difficult to evaluate/debug problems."

    Again, I'm not sure how this is specific to the current spec text, rather than being a generic problem that will apply to any and all possible solutions. The timing of the NaN's appearance is the same regardless of what arbitrary action we take in response to it. Seeing a length go to infinity, as opposed to resetting the property to its initial value, seems equally easy (or difficult) to notice and debug. If anything, an infinity showing up might be easier to debug, as it more readily points to a math issue, as opposed to the initial value which can look like the property simply isn't being applied for some reason, which is a situation with many possible causes.

    (None of the options have any differences wrt dev tools, so this is purely a judgement call over how the behavior presents itself in the page.)

  • "Further, given JS has handling of NaN different than +infinity, this proposal will introduce inconsitency in the overall web platform. We should try and avoid that."

    Yet again, no solution we have available to us is different in this regard. JS "deals with NaN" by making it infectious across math operations, and eventually causing an API to throw, stringifying it (to confusing results, possibly also resulting in an API throwing), or doing some other arbitrary action as a response to the unexpected input (such as following a codepath where a comparison returns false, as the relational operators and == all return false when at least one arg is NaN). There is no consistency in any regard here; throwing a NaN into your JS code is very likely to cause something to break somewhere, but exactly when and where that happens is completely unpredictable.

    CSS NaN is also infectious to the same degree - across math operations. CSS can't perform the equivalent of "throwing" upon receiving a NaN, as it has to keep laying out the page, but it can do something at the value level to censor it, which all proposed solutions do (just with different censoring results).


On review, I find the feedback to be unactionable. There is no option that does not run equally afoul of every complaint listed, and to my judgement, every discussed option is equally bad or worse. CSS already has several precedents for handling unrecoverable logic errors with no discernable author intent by making them broken in a way that's hopefully obvious to the author, and I believe the current spec comports itself handily among those precedents. Introducing a new type of IACVT (but at used-value time) would not make the situation materially better in any way, and I believe would make it slightly worse in some ways (importantly, in ease of deducing where the problem lies, imo).

Agenda+ to suggest resolving to close this issue as no change.

@dbaron
Member

dbaron commented Mar 29, 2022

As someone who's spent a bit of time over the past year debugging assertion failures (DCHECK failures) that involved NaN values being introduced into code that didn't expect them (very little code does, really), I'd like to add one note about NaN values:

The typical sources of NaN values are more diverse than portrayed above. Sources that I've seen include (with variations in sign):

  • 0.0 / 0.0
  • 0.0 * Infinity
  • sin(Infinity), cos(Infinity), tan(Infinity), etc.
  • Infinity - Infinity (which can come from Infinity - floor(Infinity))

This, of course, assumes IEEE 754 (a.k.a. IEC 559) floating point behavior. Note that this is not guaranteed by C++, though I think many browser implementations written in C++ assume it. In C++ it can be tested with std::numeric_limits<float>::is_iec559, std::numeric_limits<double>::is_iec559, etc. Most of these things are formally undefined behavior in C++, as I understand things.


As far as the proposed solutions go, I'd note that I weakly dislike introducing Infinity values where not necessary, since Infinity values (see a few paragraphs up) have a strong tendency to produce NaN values further down the line, which tend to lead to more bugs than the Infinity values alone do. Once you have NaN values, they tend to break a lot of developer assumptions (e.g., again assuming IEEE 754 / IEC 559), such as breaking the assumption that one of a < b, a == b, and a > b is true. On the other hand, implementations should be avoiding bugs in these cases since there are other ways to produce Infinity values -- it's just sending more cases down codepaths that are likely to be buggy.

(For example, with transform matrices, the easiest way to get NaNs is to multiply a bunch of very large numbers until you exceed the range of the relevant floating point type and get an Infinity, and then multiply that by a 0. Most components of the transform matrix are usually zero, so once you have infinities in transform matrices you very quickly get NaNs, and then you're often just a few multiplies away from a matrix that's just all NaNs.)
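
(A sketch of the kind of author-level input that can trigger that chain, assuming the implementation composes the matrices in IEEE single precision, which, as noted, C++ does not guarantee.)

/* 1e30 * 1e30 overflows single-precision floats, putting Infinity on the
   scale matrix's diagonal; composing with the rotation then pairs those
   Infinity entries with zeros, and Infinity * 0 is NaN. */
transform: scale(1e30) scale(1e30) rotate(90deg);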

@tabatkins
Member

Once you have NaN values, they tend to break a lot of developer assumptions (e.g., again assuming IEEE 754 / IEC 559), such as breaking the assumption that one of a < b, a == b, and a > b is true.

Note that we don't (yet) have direct comparisons, and the functions that do comparison-like things (clamp/min/max) all evaluate to NaN if they have a NaN argument (see the final bulleted item in the list of division by zero consequences in Values 4, section 10.9 Type Checking), so at least we currently avoid the most obvious NaN confusions.
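
(A sketch of that rule; the comparison function itself becomes NaN rather than silently picking the other argument.)

/* The NaN argument makes the whole min() evaluate to NaN, which is then
   censored when it escapes the top level (to +∞ per the draft text above). */
width: min(100px, calc(0px / 0));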

@miragecraft

miragecraft commented Apr 4, 2022

I just noticed that Chrome v100 implemented this behavior for calc(1px/0) (by mistake?) and found out that this is the intended behavior per the draft spec.

I strongly disagree with this decision, because I actually built a CSS grid system which depends on divide-by-zero causing calc() to be invalid, thereby setting the property to its initial value.

See here: http://miragecraft.com/projects/razorgrid.html

Setting --span to -1 causes my flip switch, which is calc((var(--span)+1)/(var(--span)+1)), to become invalid, while all other values cause it to stay at 1 (which I use to multiply the value I want, leaving it unchanged). This lets me change flex-basis to auto only when --span: -1, without affecting the calc() result for all other --span values.

So as you can see, divide by zero causing calc() to be invalid is actually a useful behavior, while making the value go to infinity is not.

So the argument that going to infinity is picking the most blatant of two wrongs is incorrect, because it's possible - as I just demonstrated - that a developer can intentionally divide a value by zero to force the property to go to the initial value.
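
(For reference, a sketch of the pattern being described; --flip and the 100px multiplicand are stand-ins, the rest follows the comment.)

.item {
  /* --span: -1 makes this 0/0 (NaN). Under the pre-Chrome-100 behavior that
     made the calc() invalid, so flex-basis fell back to its initial value
     (auto); any other --span leaves the multiplier at exactly 1. */
  --flip: calc((var(--span) + 1) / (var(--span) + 1));
  flex-basis: calc(var(--flip) * 100px);
}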

@LeaVerou
Member

@miragecraft This is not the behavior we are discussing in this issue, because dividing by 0 does not produce NaN; it produces infinity (or -infinity, if dividing a negative number by 0). The only case where dividing by 0 produces NaN is when dividing 0/0.

@miragecraft

miragecraft commented Apr 4, 2022

@LeaVerou I see, so I was mistaken regarding how widespread this issue is; however, I AM actually using the 0/0 behavior.

Because my flip switch depends on making the result 1 for non-trigger values, I have to keep the dividend and divisor the same.

I'm repeating it here again: calc((var(--span)+1)/(var(--span)+1)).

So in the case when --span:-1 it will cause calc() to result in calc(0/0).

It has to be 0/0 because it needs to resolve to 1 for non-zero inputs.

@Loirooriol
Contributor Author

@miragecraft It seems to me you actually want conditionals like

--x-shrink: if(var(--s) = -1; initial; 1);

Something like this was proposed in #5009 (comment), but of course this thing doesn't exist yet.

At first I thought that just using sign() would do the trick; arithmetic conditionals are precisely why I proposed it in #4673:

--x-shrink: abs(sign(var(--s) + 1));
flex-basis: calc(var(--auto-span) * var(--x-shrink) - 1px * (1 - var(--x-shrink)));

but the problem is that calc() will just clamp the deliberately invalid -1px into 0px instead of invalidating flex-basis.

So you will need registered properties:

@property --flex-basis {
  syntax: "<length>";
  inherits: false;
  initial-value: -1px;
}
#target {
  --x-shrink: abs(sign(var(--s) + 1));
  --flex-basis: calc(var(--auto-span) * var(--x-shrink) - 1px * (1 - var(--x-shrink)));
  flex-basis: var(--flex-basis);
}

Also, Chromium has not implemented sign() yet :( so here is an alternative that avoids sign():

@property --x-shrink {
  syntax: "<number>";
  inherits: false;
  initial-value: calc(1px / 0);
}
@property --x-shrink-2 {
  syntax: "<number>";
  inherits: false;
  initial-value: calc(1px / 0);
}
@property --flex-basis {
  syntax: "<length-percentage>";
  inherits: false;
  initial-value: -1px;
}
#target {
  --x-shrink: calc((var(--s) + 1) / (var(--s) + 1));
  --x-shrink-2: max(0, var(--x-shrink) * -1 + 2);
  --flex-basis: calc(var(--auto-span) * var(--x-shrink-2) - 1px * (1 - var(--x-shrink-2)));
  flex-basis: var(--flex-basis);
}

So I think your use case is still doable, but the lack of conditionals sucks. Also, Firefox and WebKit haven't implemented registered properties...

@miragecraft

@Loirooriol Yes, you're absolutely right: if conditionals were available in CSS then I wouldn't have to essentially hack calc(0/0) to flip property values.

But they aren't, so I do.

@miragecraft

miragecraft commented Apr 5, 2022

I was able to work around this problem: instead of invalidating calc() via divide-by-zero, I made calc() resolve to a negative value and invalidated the flex-basis property instead. This technique can also be used for any property that will be invalidated if set to a negative value (such as width and height).

Some properties can't be invalidated in this way, such as margin, but instead of invalidating I can make it resolve to 0 via creative use of max() and min().
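
(A sketch of the clamp idea; --result stands in for whatever calculation may deliberately go negative.)

/* margin can't be invalidated with a negative value, but the deliberately
   negative result can be floored to 0 instead: */
margin-left: max(0px, var(--result));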

@Loirooriol
Contributor Author

That's what I depicted in my 2nd code block, but calc() clamps:

Parse-time range-checking of values is not performed within math functions, and therefore out-of-range values do not cause the declaration to become invalid. However, the value resulting from an expression must be clamped to the range allowed in the target context.

How did you workaround that without using registered custom properties?

@miragecraft

miragecraft commented Apr 5, 2022

Edit: I was wrong.

@miragecraft

@Loirooriol You're absolutely right: it didn't invalidate flex-basis; I was mistaken. It went to zero like the spec states. I just didn't notice it right away.

I decided to invalidate it by using a string value instead, --span:wrap, which did finally invalidate the calc() statement. Of course it could have been any string, like --span:jibberish, but it's better to be descriptive.

@smfr smfr changed the title [css-values] Make top-level NaN be IACVT instead of ∞ [css-values] Make top-level NaN be invalid at computed value time instead of ∞ Apr 13, 2022
@css-meeting-bot
Member

The CSS Working Group just discussed [css-values] Make top-level NaN be invalid at computed value time instead of ∞.

The full IRC log of that discussion:
<fantasai> topic: [css-values] Make top-level NaN be invalid at computed value time instead of ∞
<fantasai> github: https://github.com//issues/7067
<fantasai> TabAtkins: When we added the ability to do arbitrary unit math to calc(), we introduced the ability to divide by zero
<fantasai> TabAtkins: these turn into infinities, fine, handle as large number
<fantasai> TabAtkins: We also introduced NaNs, though
<fantasai> TabAtkins: We don't know how to make those behave
<fantasai> TabAtkins: We can't know what author intended without inspecting code
<fantasai> TabAtkins: so need to do something arbitrary to NaN
<fantasai> TabAtkins: Current spec turns NaN into infinity if it escapes a calc() function
<fantasai> TabAtkins: I chose this because a) NaN is definitely an authoring error, always a mistake
<fantasai> TabAtkins: b) infinity is the most likely value to look wrong if you're doing something with this
<fantasai> TabAtkins: We have an unofficial policy throughout CSS: whenever we have a situation that we can't detect at parse time, but is obviously a mistake
<lea> q?
<lea> q+
<fantasai> TabAtkins: we make it break in an obvious way, so that author is more likely to notice the mistake
<emilio> q+
<fantasai> TabAtkins: rather than doing something more subtle, that might be missed
<fantasai> TabAtkins: This came up during TAG review, and Lea suggested that it's better handled as "invalid at computed value time" behavior
<fantasai> TabAtkins: I don't have a strong opinion on this
<fantasai> TabAtkins: my only objection to changing the behavior is that it won't be invalid at computed value time, might not know it's invalid until *used* value time
<Rossen_> q
<fantasai> TabAtkins: so similar to IACVT, but distinct
<fantasai> TabAtkins: And whatever effect it has would behave differently wrt inheritance, for example
<fantasai> TabAtkins: The other reason is that I think IACVT behavior is less likely to be noticeable
<fantasai> TabAtkins: It's less likely to be obviously wrong, compared to treating as infinity
<fantasai> TabAtkins: I think NaN should break things as much as we possibly can
<fantasai> TabAtkins: and would be useful for authors to give them the more broken rendering
<fantasai> TabAtkins: but open for discussion
<emilio> ack fantasai
<Rossen_> ack fantasai
<Rossen_> ack lea
<fantasai> lea: I don't remember most of the arguments from TAG review, but think there's a more fundamental principle here
<fantasai> lea: errors don't always arise from authoring, sometimes from user-generated or other variable input
<fantasai> lea: and user will be first to see this
<fantasai> lea: I want to avoid making the website unreadable
<fantasai> lea: For things like layout, can be very bad
<lea> q?
<fantasai> lea: I'm not opposed, but problem with infinity because it would introduce really bad breakage, would be seen by users first
<Rossen_> q?
<oriol> q+
<fantasai> TabAtkins: I don't agree that invalid at computed value time is less likely to break
<Rossen_> q+
<fantasai> TabAtkins: if you're specifying a width as a calc(), that's going to be some specific value. If at some point, because of using variables, it results in a NaN and becomes invalid at computed-value time, that'll switch to 'auto'
<fantasai> TabAtkins: which is just as completely different from author's intent as infinity is
<fantasai> TabAtkins: we cannot divine an intent from a NaN
<miriam> q+
<fantasai> lea: Width: auto might produce a layout that's not intended, but not the same level of breakage as infinity
<fantasai> lea: which can stretch out of the viewport and make content unreadable
<fantasai> emilio: I was going to point out what Tab pointed out, that we still have an issue of what happens if NaN happens at used value time
<fantasai> emilio: I'm not a fan of introducing behavior that depends on cascade before/after
<fantasai> emilio: if we detect NaN at computed value or used value time...
<fantasai> emilio: As for what specific behavior, I don't have a super strong opinion
<fantasai> emilio: When we define NaNs in ..., we normalize to zero.
<fantasai> emilio: Usually just happens in testcases
<Rossen_> ack emilio
<iank_> we can only really determine this stuff at used value time.
<fantasai> emilio: I don't think it will matter in practice
<fantasai> emilio: I think zero is a bit more reasonable, doesn't create infinite scrollbars
<fantasai> emilio: I think I'm not a fan of the 2 different behaviors based on when you find the NaN
<miriam> q-
<fantasai> oriol: I'm not a big fan of the currently-specified behavior of treating NaN as positive infinity, which seems arbitrary to me
<fantasai> oriol: That's why I filed this issue to try to choose something closer to invalid at computed value time
<fantasai> oriol: During discussion realized this would be a problem for properties that have multiple components
<fantasai> oriol: Maybe only one of these is NaN
<fantasai> oriol: And sometimes we turn these properties into shorthands,
<TabAtkins> If it's infinity specifically that bugs people, I can go with "treat as 0".
<Rossen_> ack oriol
<fantasai> oriol: If we invalidate the entire property now, and later turn it into a shorthand, then we would only invalidate specific longhands
<TabAtkins> Just as likely to be wrong, but less likely to pop scrollbars.
<fantasai> oriol: To prevent this creating a change in behavior
<fantasai> oriol: we would need some kind of ???
<lea> I think "treat as 0" is more reasonable. width: 0; overflow: hidden would be problematic, but hopefully that'd be rare
<fantasai> oriol: So now I'm more aligned with choosing a specific number for NaN instead of an approach similar to IACVT
<fantasai> oriol: I'm not sure infinity is the best, maybe zero or minus infinity
<fantasai> Rossen_: I think that during our discussion, and reason to revisit, the compounding effect of infinity can have very unpredictable results based on current user settings, viewport size, fonts available, etc.
<fantasai> Rossen_: that will make detectability of such errors difficult for authors, since spurious at best
<fantasai> Rossen_: Given we're out of time, don't want to force a resolution
<fantasai> Rossen_: but want to really go back and continue and see if we can agree on something more platform friendly
<fantasai> TabAtkins: in IRC, discussing changing to zero
<fantasai> TabAtkins: I'm fine with that, Lea's fine with it
<chris> zero looks better to me
<fantasai> Rossen_: Ok, let's return to that next week

@tabatkins
Member

We ran out of time, but at the end of the discussion I suggested just censoring NaN to 0, if it's infinity specifically that's the issue (and not the more general censoring approach). It's just as likely to break a page, tho in a way that's possibly less disruptive (likely less scrollbars!), and it retains the simplicity of the current censoring, as opposed to the complexity of trying to define IACVT for used-value time.

This seemed to make a lot of people in IRC happier, so for next week, proposed resolution is we change the NaN censoring behavior to turn it into a 0.

@LeaVerou
Member

As I said in IRC, I think 0 is a lot more reasonable and less likely to introduce horrible breakage.

@plinss
Member

In general, I'd prefer some kind of behavior that makes a NaN calc() invalid so that regular fallback can happen, but I accept that may not be doable with calc().

I'm OK with 0 instead of infinity (infinity really bothers me because it's completely different behavior for NaN than is used anywhere else), but I also had a thought about providing an optional fallback within the calc() itself, e.g. calc(--foo / --bar, 0), where the default value of the fallback would be 0. This would allow authors who know what they're doing (and understand that NaN can happen in their calc()) to provide the fallback that makes the most sense for their situation.
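
(A sketch of the hypothetical fallback syntax described above, written with var() and stand-in property names for concreteness; this is not part of any spec.)

/* If the division works out to NaN (e.g. both custom properties substitute
   as 0), the calc() would resolve to the fallback after the comma instead
   of censoring the NaN: */
width: calc(var(--numerator) / var(--denominator) * 1px, 100px);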

@Loirooriol
Contributor Author

Being able to detect NaNs seems useful indeed, but I think this should be addressed more generally with conditionals (#5009 (comment)), e.g.

width: if(isNaN(0px / 100%); 100px; 200px);

but also

width: if(100% < 50px; 100px; 200px);

@LeaVerou
Member

In general, I'd prefer some kind of behavior that makes a NaN calc() invalid so that regular fallback can happen, but I accept that may not be doable with calc().

I'm OK with 0 instead of infinity (infinity really bothers me because it's completely different behavior for NaN than is used anywhere else), but I also had a thought about providing an optional fallback within the calc() itself, e.g. calc(--foo / --bar, 0), where the default value of the fallback would be 0. This would allow authors who know what they're doing (and understand that NaN can happen in their calc()) to provide the fallback that makes the most sense for their situation.

Would that also work for cases where the calculation result is IACVT or only for NaN? That would be quite useful for providing fallbacks for entire calculations without having to assign to a pointless variable.

@tabatkins
Member

IACVT doesn't trigger at the individual value level; it wipes out the entire property. I don't want to try and complexify things at that point; assigning to a throwaway custom property does the job fine enough.

Being able to detect NaNs seems useful indeed, but I think this should be addressed more generally with conditionals

Agreed. We could even special-case a literal NaN there so you can just test for equality directly (but probably continue to use IEEE-754 semantics for comparisons between two calculations that both result in a NaN).

@plinss
Member

Would that also work for cases where the calculation result is IACVT or only for NaN? That would be quite useful for providing fallbacks for entire calculations without having to assign to a pointless variable.

It was just a random thought, and I didn't bother to dig into all the ramifications. At first blush, I'd expect it to operate like the fallback value on var(), e.g. any time the calc() fails to produce a value, use this instead. But that breaks the 'default the fallback value to 0' idea, as all invalid calc()s would start turning into 0s. So if we went this route I think we'd have to either make it only apply to NaN, or make the fallback only default to 0 for a NaN (which seems weird) and have a non-specified fallback be invalid otherwise (tho a specified fallback would apply in either case).

@emilio
Collaborator

A precedent for 0 is in https://webidl.spec.whatwg.org/#abstract-opdef-converttoint, fwiw.

@Loirooriol
Contributor Author

https://tc39.es/ecma262/#sec-tointegerorinfinity also chooses 0 instead of infinity.

@css-meeting-bot
Member

The CSS Working Group just discussed [css-values] Make top-level NaN be invalid at computed value time instead of ∞, and agreed to the following:

  • RESOLVED: NaN escaping calc() tree is coerced to zero
The full IRC log of that discussion:
<fantasai> Topic: [css-values] Make top-level NaN be invalid at computed value time instead of ∞
<fantasai> github: [css-values] Make top-level NaN be invalid at computed value time instead of ∞
<astearns> github: https://github.com//issues/7067
<fantasai> TabAtkins: last week we discussed behavior of censoring NaNs when they escape calc()
<fantasai> TabAtkins: previously specced as infinity
<fantasai> TabAtkins: discussion seemed to conclude that censoring to zero would be acceptable
<fantasai> TabAtkins: Since then, comments in issue supporting that interpretation, for a variety of good reasons
<fantasai> TabAtkins: This works well enough for my purposes, so suggest to resolve that NaN resolves to zero when escaping a calc() tree
<fantasai> iank_: can you elaborate more?
<fantasai> iank_: Inside a calc() tree, if you've got something relatively nested that has 1/0 or something that will be a NaN
<fantasai> iank_: is that where it is coerced to zero?
<fantasai> iank_: or will it propagate up
<fantasai> TabAtkins: it propagates up, infecting most of the tree following IEEE rules
<fantasai> TabAtkins: almost the same as JS also
<fantasai> TabAtkins: as soon as you escape the top-level function, properties don't know how to deal, so turn it into zero
<fantasai> iank_: ok, thanks
<TabAtkins> s/1/0/
<tantek> sounds reasonable
<fantasai> RESOLVED: NaN escaping calc() tree is coerced to zero
<lea> +1
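
(With the resolution above, the earlier example behaves as follows.)

/* A NaN that escapes the calc() tree is now used as 0 rather than +∞: */
width: calc(0 / 0 * 1px);   /* ends up as 0px */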

@tabatkins
Member

I'm pretty sure we have some NaN tests (I believe I wrote some) and they'll need to be changed to match the resolution.

moz-wptsync-bot pushed a commit to web-platform-tests/wpt that referenced this issue Aug 17, 2022
Fix some tests to:

 * Not assume `double` precision.
 * Account for recent working group resolution with regards to NaN: w3c/csswg-drafts#7067 (comment)

Not sure I caught all, but normalizing to 0 was already our existing
behavior. This feature needs more work before it can be enabled more
generally, so make it nightly-only, for now.

Also, it's unclear per spec what the serialization for infinity*1s or so
should be. Right now we serialize to <very-big-number>s, which seems
reasonable, but some tests (but not others!) expect different behavior.

I left those untouched for now.

Differential Revision: https://phabricator.services.mozilla.com/D154883

bugzilla-url: https://bugzilla.mozilla.org/show_bug.cgi?id=1682444
gecko-commit: bf0c0ff13cc1586a185bb0d9f39a9bbf6be554c3
gecko-reviewers: boris
webkit-commit-queue pushed a commit to nt1m/WebKit that referenced this issue Jul 9, 2023
…nity)

https://bugs.webkit.org/show_bug.cgi?id=259034
rdar://111984451

Reviewed by Darin Adler.

This changed in w3c/csswg-drafts@b5113b9

Discussion was at w3c/csswg-drafts#7067

* LayoutTests/imported/w3c/web-platform-tests/css/css-color/parsing/color-computed-color-function-expected.txt:
* LayoutTests/imported/w3c/web-platform-tests/css/css-color/parsing/color-computed-lab-expected.txt:
* LayoutTests/imported/w3c/web-platform-tests/css/css-color/parsing/color-valid-color-function-expected.txt:
* LayoutTests/imported/w3c/web-platform-tests/css/css-color/parsing/color-valid-lab-expected.txt:
* LayoutTests/imported/w3c/web-platform-tests/css/css-values/calc-infinity-nan-computed-expected.txt:
* LayoutTests/imported/w3c/web-platform-tests/css/css-values/round-mod-rem-serialize-expected.txt:
* LayoutTests/imported/w3c/web-platform-tests/css/css-values/signs-abs-computed-expected.txt:
* Source/WebCore/css/calc/CSSCalcOperationNode.h:
* Source/WebCore/css/calc/CSSCalcValue.cpp:
(WebCore::CSSCalcValue::clampToPermittedRange const):

Canonical link: https://commits.webkit.org/265891@main