Tuesday, July 3, 2018

A one word change to the C standard to make undefined behavior sane again

A lot has been written about C undefined behavior, some of it by me and much more by people who know far more about compilers than I do. However, I now believe that a seemingly innocuous but far-reaching change to the standard gave permission for the current craziness, and I think undoing that change could be a start toward rectifying the situation.

Proposal

In section 3.4.3, change the word "possible" back to "permissible", the way it was in C89.

Background

In all versions of the standard I have checked, section 3.4.3 defines the term "undefined behavior".
undefined behavior
behavior, upon use of a nonportable or erroneous program construct or of erroneous data, for which this International Standard imposes no requirements

So that seems pretty clear: the compiler can do whatever it wants. But wait, there is a second paragraph that clarifies:

Permissible undefined behavior ranges from ignoring the situation completely with unpredictable results, to behaving during translation or program execution in a documented manner characteristic of the environment (with or without the issuance of a diagnostic message), to terminating a translation or execution (with the issuance of a diagnostic message).
So it's not a free-for-all; in fact, it is pretty clear about what the compiler is and is not allowed to do, as there are essentially three options:
  1. It "ignores" the situation completely, so if the CPU hardware produces an overflow or underflow on an arithmetic operation, well that's what you get. If you write to a string constant, the compiler emits the write and either the string constant might get changed if there is no memory protection for string constants or you might get a segfault if there is.
  2. It "behaves in a manner characteristic of the environment". So no "demons flying out of your nose" nonsense, and no arbitrary transformations of programs. And whatever you do, you have to document it, though you are not required to print a diagnostic.
  3. It can terminate with an error message.
I would suggest that current behavior is not one of these three, and it's not in the range bounded by these three either. It is clearly outside that defined range of "permissible" undefined behavior.
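
As a concrete illustration of option 1, here is a minimal sketch (a hypothetical program of my own, not from the standard) of the string-constant case: the compiler just emits the store, and what happens next depends entirely on the environment.

    #include <stdio.h>

    int main(void) {
        char *s = "hello";  /* pointer to a string literal */
        s[0] = 'H';         /* undefined behavior: under option 1 the compiler
                               simply emits the store; the literal changes if it
                               is writable, or the program segfaults if the
                               literal lives in read-only memory */
        printf("%s\n", s);
        return 0;
    }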

But of course compiler writers have an out, because more recent versions of the standard changed the word "permissible", which clearly restricts what you are allowed to do, to "possible", which means this is just an illustration of what might happen.

So let's change the word back to "permissible".

9 comments:

Shafik Yaghmour said...

That second paragraph is labeled as a "Note", which means that it is non-normative (informational) and does not contain requirements, so it is not binding on the implementation.

See this Stack Overflow answer https://stackoverflow.com/a/21364467/1708801 which quotes the relevant ISO/IEC directive.

Shafik Yaghmour said...

Interesting: it looks like C89 did not have notes, but a lot of content was moved into notes in C99. Still, the intent seems to have been that this paragraph not be binding.

Marcel Weiher said...

Well, speaking of a range of "permissible" things does sound fairly binding to me.

https://www.merriam-webster.com/dictionary/permissible

YMMV.

Peter De Wachter said...

I don't think that permissible/possible is the problem. "Ignoring the situation completely with unpredictable results" is precisely what current compilers do: they apply a series of program transformations, each preserving all defined behavior, and ignoring any undefined behavior.

Consider how "deleted NULL tests" happen. The compiler sees a pointer dereference. If that pointer were NULL, the program would have undefined behavior, so the compiler can ignore that situation completely, even following the C89 wording. If the compiler then sees a comparison against NULL, it considers all situations that the standard requires it to consider, and it sees that in those situations the comparison is always false.

That's how that phrase is understood.

mcherm said...

Believe it or not, the compiler writers are not malicious operators out to get you. They have not designed their compilers to make demons fly out of your nose simply in order to teach those careless programmers a lesson about careful attention to standards.

C (and, to be fair, nearly every programming language) purports to operate on a very simple machine: one in which some CPU performs steps one at a time, reading from and writing to variables (stored in main memory) before and after each step. You can still purchase CPUs that work this way, but no modern computer does.

In a modern computer, it takes hundreds to thousands of times longer to read data from memory and write it back than it does to perform a step of a calculation. And doing just one step of a calculation at a time is terribly wasteful when pipelining and other parallelism can perform many steps simultaneously. And that's just on a single core -- even your phone is likely running multiple cores. To make these things run thousands of times faster, we build layer after layer of caching and cleverly pack multiple commands into pipelines that can be executed together.

To take an analogy that one can understand without knowing anything about modern CPUs, imagine writing an algorithm to sort an array. Now imagine the change in how that algorithm would perform if you ran it as written versus if you performed a bounds check before every read or write. Your code might spend 80% of its time performing bounds checks, or (to put it differently) run 5 times faster without the checks.
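
To make that concrete, a rough sketch (hypothetical code of my own, and a simple array traversal rather than a full sort): the same loop written with direct accesses versus every access routed through a checked accessor.

    #include <assert.h>

    /* checked accessor: pays a bounds check on every single read */
    static int at(const int *a, int n, int i) {
        assert(i >= 0 && i < n);
        return a[i];
    }

    int sum_unchecked(const int *a, int n) {
        int s = 0;
        for (int i = 0; i < n; i++) s += a[i];        /* as written */
        return s;
    }

    int sum_checked(const int *a, int n) {
        int s = 0;
        for (int i = 0; i < n; i++) s += at(a, n, i); /* check before every read */
        return s;
    }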

Now, compiler-writers cannot eliminate most of the "checks" that they use to ensure that the complex web of caches and pipelining we use is producing the results that we desire. But they can eliminate a lot of them. Doing so requires making assumptions. And the C spec provides the assumptions that they can make -- specifically, that programmers are not doing any of the things that are listed as "undefined" behavior.

If they ARE doing those things, the compilers do not actually go out of their way to program in flying nose demons. Instead, what they do is to ignore the problem -- one of the specific actions you propose that the spec permit. The problem is that you are wrong when you say that ignoring the problem will clearly and obviously lead to simple, understandable results (like changing the string constant or segfaulting). In fact, ignoring it can lead to changes like modification of a different value in memory -- the compiler writer doesn't have a way to know just what that other value does, but given how this programmer writes, it's probably the flag that controls the daemon behavior and setting it to "3" will direct the daemons out through your nose.

In other words, changing the spec would NOT have the effect you desire of making modern computer hardware behave in a simple, straightforward fashion. It MIGHT have the effect of drastically slowing down properly written programs in order to slightly constrain the kinds of errors we could get from running invalid programs.

victor yodaiken said...

This is essentially the proposal I have been circulating for the last couple of months.

https://docs.google.com/document/d/1xouelPcphQ-o7DmdSwz5UcL42M6bdA3t93Nm_5Hbomc/edit?usp=sharing

victor yodaiken said...

mcherm repeats some common justifications for the current sad state of C undefined behavior, but those are wrong.

Modern CPUs don't have any behavior that would have shocked CPU designers of the 1970s or, really, even before. In particular, pipelining was common and was even used in the PDP-11 processors. The C abstract machine is totally compatible with pipelines, out-of-order execution, caches, speculation, and with standard techniques of compiler optimization like peephole optimization, constant reduction, etc. None of these things are new discoveries.

Nothing about pipelining or any other feature of processors requires bounds checking. In fact, the original purpose of the C undefined behavior provision was to license compilers to *ignore* bounds checking, which was left as a responsibility of programmers.

It is wrong to claim that our choice is between the compiler *preventing* undefined behavior and the compiler abandoning the C semantics when it encounters undefined behavior. Nobody is asking compilers to transform C into a safe language. Everyone understands that e.g. writing past an array boundary might cause unpredictable program behavior; what they are insisting on is that the compiler does not create surprises. It is really irritating to have people consistently attempt to defend the misbehavior of C compilers by (incorrectly) explaining some elementary aspect of processor design or compilation.

Marcel Weiher said...

Thank you, Victor, my thoughts exactly.

@mcherm: I know a tiny little bit about optimization (hint: check out the book!), and about CPUs. I've been programming in C for well over 30 years. What you write has absolutely nothing to do with the things compilers do these days, and nothing to do with the hardware. The 68K I wrote my first C programs on had two's complement arithmetic just like modern CPUs do, and I even wrote code on our school's PDP-11.

No, C does not do bounds checks. It is modern optimizing compilers that check the bounds and eliminate the code if the last iteration goes outside bounds. Including the parts of the loop that do not go out of bounds. It is modern C compilers that eliminate bounds checks that the programmer put in, because they "proved" that the accesses cannot go out of bounds. Incorrectly, it turns out.

And no, those compilers are not "ignoring" the undefined behavior. Ignoring it would leave the out-of-bounds access in, and the rest of the loop, and it would also leave the bounds-check in. What they are doing is actively exploiting undefined behavior.
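
A related, often-cited case (my own sketch, using a signed-overflow guard rather than an array bound; the function name is hypothetical): a check the programmer wrote gets "proved" unreachable and removed.

    int bump(int x) {
        if (x + 100 < x)   /* intended wrap-around guard written by the programmer */
            return -1;     /* because signed overflow is undefined, a compiler may
                              conclude x + 100 can never be less than x and delete
                              this branch, guard and all */
        return x + 100;
    }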

And no, there is no evidence that doing so has any noticeable positive effect on performance, except in contrived benchmark situations, and even there it's nothing to write home about. See http://www.complang.tuwien.ac.at/kps2015/proceedings/KPS_2015_submission_29.pdf

Also compare Proebsting's law: the overall contribution of compiler technology to performance is not all that overwhelming in the first place, and most of the gains come from the low-hanging fruit.

Victor Yodaiken said...

Peter De Wachter:
This is a situation where it's easy to make something seem plausible, even when it is false. The compiler is not ignoring the previous null use; it is unsoundly deducing:

dereference of pointer -> pointer is not NULL

There is nothing in the standard that supports that deduction. The standard specifies how if (p == NULL) { ... } else { ... } should work, and the compiler is ignoring that specification because some other code element has undefined behavior. The standard permits omission of a calculation only when the calculation is redundant and has no side effects. In this case, no calculation was done previously and there may well be a side effect.

Code: ... if (p == NULL) ... if (p == NULL) ...

can be optimized per the standard, because it is a redundant computation with no side effects. However,

Code: ... p->x; ... if (p == NULL) ...

does not meet the requirements. The idea that undefined behavior is somehow impossible makes no sense at all.
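
For concreteness, self-contained versions of the two fragments above, as I read them (the names and struct are hypothetical):

    struct node { int x; };

    /* First fragment: a genuinely redundant test with no intervening side
       effects; the as-if rule lets the compiler evaluate it only once. */
    int first(struct node *p) {
        if (p == NULL) return 0;
        if (p == NULL) return 0;   /* may legitimately be folded away */
        return p->x;
    }

    /* Second fragment: no test of p precedes the dereference; deleting the
       later check is the behavior being objected to here. */
    int second(struct node *p) {
        int x = p->x;
        if (p == NULL) return -1;  /* current compilers remove this branch by
                                      reasoning backwards from the dereference */
        return x;
    }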