Query: Would the compiler writers have to write compilers from scratch for C++20 or the features can be added without overwriting the entire system?
I am aware that new features mean a few parts need to be rectified. But does the whole thing need to change?
And I agree that C++ compiler writers are gods. Looking at the source code for GCC/Clang makes me doubt my programming skills. The same goes for the Linux kernel source code.
If the parser is hard to write, that means the syntax is conditional upon previous statements, which I've not seen in the spec. As long as the syntax is purely lexical, there are no real issues parsing. How does context throw curveballs?
Consider the statement `foo * bar;`. This is either an expression statement computing the product of `foo` and `bar`, or the declaration of a variable `bar` as a pointer to `foo`.
To resolve this ambiguity, you need context: if `foo` is a type available in the current scope, then it's a variable declaration; otherwise, it's an expression.
Yes. The parser may need to backtrack due to context sensitivity. A quote from the source comments:
> Some C++ constructs require arbitrary look ahead to disambiguate. For example, it is impossible, in the general case, to tell whether a statement is an expression or declaration without scanning the entire statement.
I am sure there must be some devilish source-code examples that require an exponential search for the correct parse. Also, has anyone constructed a modern C++ parser in Bison that can backtrack?
Making sure that features are implementable on existing compilers (without major changes to the AST or breaking the ABI, for example) is a significant concern for the committee, and features often get dropped for this reason.