Auto merge of rust-lang#80790 - JohnTitor:rollup-js1noez, r=JohnTitor
Rollup of 10 pull requests

Successful merges:

 - rust-lang#80012 (Add pointing const identifier when emitting E0435)
 - rust-lang#80521 (MIR Inline is incompatible with coverage)
 - rust-lang#80659 (Edit rustc_ast::tokenstream docs)
 - rust-lang#80660 (Properly handle primitive disambiguators in rustdoc)
 - rust-lang#80738 (Remove bottom margin from crate version when the docs sidebar is collapsed)
 - rust-lang#80744 (rustdoc: Turn `next_def_id` comments into docs)
 - rust-lang#80750 (Don't use to_string on Symbol in rustc_passes/check_attr.rs)
 - rust-lang#80769 (Improve wording of parse doc)
 - rust-lang#80780 (Return EOF_CHAR constant instead of magic char.)
 - rust-lang#80784 (rustc_parse: Better spans for synthesized token streams)

Failed merges:

 - rust-lang#80785 (rustc_ast_pretty: Remove `PrintState::insert_extra_parens`)

r? `@ghost`
`@rustbot` modify labels: rollup
bors committed Jan 7, 2021
2 parents 8f0b945 + 695f878 commit c8915ee
Showing 43 changed files with 607 additions and 372 deletions.
2 changes: 1 addition & 1 deletion compiler/rustc_ast/src/token.rs
@@ -771,7 +771,7 @@ impl fmt::Display for NonterminalKind {
}

impl Nonterminal {
fn span(&self) -> Span {
pub fn span(&self) -> Span {
match self {
NtItem(item) => item.span,
NtBlock(block) => block.span,
33 changes: 17 additions & 16 deletions compiler/rustc_ast/src/tokenstream.rs
@@ -1,15 +1,15 @@
//! # Token Streams
//!
//! `TokenStream`s represent syntactic objects before they are converted into ASTs.
//! A `TokenStream` is, roughly speaking, a sequence (eg stream) of `TokenTree`s,
//! which are themselves a single `Token` or a `Delimited` subsequence of tokens.
//! A `TokenStream` is, roughly speaking, a sequence of [`TokenTree`]s,
//! which are themselves a single [`Token`] or a `Delimited` subsequence of tokens.
//!
//! ## Ownership
//!
//! `TokenStream`s are persistent data structures constructed as ropes with reference
//! counted-children. In general, this means that calling an operation on a `TokenStream`
//! (such as `slice`) produces an entirely new `TokenStream` from the borrowed reference to
//! the original. This essentially coerces `TokenStream`s into 'views' of their subparts,
//! the original. This essentially coerces `TokenStream`s into "views" of their subparts,
//! and a borrowed `TokenStream` is sufficient to build an owned `TokenStream` without taking
//! ownership of the original.
@@ -24,9 +24,9 @@ use smallvec::{smallvec, SmallVec};

use std::{fmt, iter, mem};

/// When the main rust parser encounters a syntax-extension invocation, it
/// parses the arguments to the invocation as a token-tree. This is a very
/// loose structure, such that all sorts of different AST-fragments can
/// When the main Rust parser encounters a syntax-extension invocation, it
/// parses the arguments to the invocation as a token tree. This is a very
/// loose structure, such that all sorts of different AST fragments can
/// be passed to syntax extensions using a uniform type.
///
/// If the syntax extension is an MBE macro, it will attempt to match its
@@ -38,9 +38,9 @@ use std::{fmt, iter, mem};
/// Nothing special happens to misnamed or misplaced `SubstNt`s.
#[derive(Debug, Clone, PartialEq, Encodable, Decodable, HashStable_Generic)]
pub enum TokenTree {
/// A single token
/// A single token.
Token(Token),
/// A delimited sequence of token trees
/// A delimited sequence of token trees.
Delimited(DelimSpan, DelimToken, TokenStream),
}

@@ -62,7 +62,7 @@ where
}

impl TokenTree {
/// Checks if this TokenTree is equal to the other, regardless of span information.
/// Checks if this `TokenTree` is equal to the other, regardless of span information.
pub fn eq_unspanned(&self, other: &TokenTree) -> bool {
match (self, other) {
(TokenTree::Token(token), TokenTree::Token(token2)) => token.kind == token2.kind,
@@ -73,7 +73,7 @@ impl TokenTree {
}
}

/// Retrieves the TokenTree's span.
/// Retrieves the `TokenTree`'s span.
pub fn span(&self) -> Span {
match self {
TokenTree::Token(token) => token.span,
@@ -140,7 +140,7 @@ impl CreateTokenStream for TokenStream {
}
}

/// A lazy version of `TokenStream`, which defers creation
/// A lazy version of [`TokenStream`], which defers creation
/// of an actual `TokenStream` until it is needed.
/// `Box` is here only to reduce the structure size.
#[derive(Clone)]
@@ -188,11 +188,12 @@ impl<CTX> HashStable<CTX> for LazyTokenStream {
}
}

/// A `TokenStream` is an abstract sequence of tokens, organized into `TokenTree`s.
/// A `TokenStream` is an abstract sequence of tokens, organized into [`TokenTree`]s.
///
/// The goal is for procedural macros to work with `TokenStream`s and `TokenTree`s
/// instead of a representation of the abstract syntax tree.
/// Today's `TokenTree`s can still contain AST via `token::Interpolated` for back-compat.
/// Today's `TokenTree`s can still contain AST via `token::Interpolated` for
/// backwards compatibility.
#[derive(Clone, Debug, Default, Encodable, Decodable)]
pub struct TokenStream(pub(crate) Lrc<Vec<TreeAndSpacing>>);

@@ -429,7 +430,7 @@ impl TokenStreamBuilder {
}
}

/// By-reference iterator over a `TokenStream`.
/// By-reference iterator over a [`TokenStream`].
#[derive(Clone)]
pub struct CursorRef<'t> {
stream: &'t TokenStream,
@@ -457,8 +458,8 @@ impl<'t> Iterator for CursorRef<'t> {
}
}

/// Owning by-value iterator over a `TokenStream`.
/// FIXME: Many uses of this can be replaced with by-reference iterator to avoid clones.
/// Owning by-value iterator over a [`TokenStream`].
// FIXME: Many uses of this can be replaced with by-reference iterator to avoid clones.
#[derive(Clone)]
pub struct Cursor {
pub stream: TokenStream,
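The `tokenstream.rs` doc edits above describe compiler-internal types, but the same tree-of-tokens shape can be observed from ordinary code. The sketch below uses the third-party `proc-macro2` crate (an assumption for illustration; it mirrors `TokenStream`/`TokenTree` but is not the `rustc_ast` API) to show that a stream is a flat sequence of single tokens and delimited groups, walked here much like the `Cursor`/`CursorRef` iterators in the diff:

```rust
// Cargo.toml (assumed dependency): proc-macro2 = "1"
use proc_macro2::{TokenStream, TokenTree};

// Recursively print a token stream: each element is either a single token
// or a delimited group containing a nested stream.
fn walk(stream: TokenStream, depth: usize) {
    for tree in stream {
        match tree {
            TokenTree::Group(group) => {
                println!("{:indent$}group {:?}", "", group.delimiter(), indent = depth * 2);
                walk(group.stream(), depth + 1);
            }
            other => println!("{:indent$}token `{}`", "", other, indent = depth * 2),
        }
    }
}

fn main() {
    let stream: TokenStream = "foo + (bar * 2)".parse().expect("valid tokens");
    walk(stream, 0);
}
```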
10 changes: 2 additions & 8 deletions compiler/rustc_ast_lowering/src/lib.rs
@@ -206,8 +206,7 @@ pub trait ResolverAstLowering {
) -> LocalDefId;
}

type NtToTokenstream =
fn(&Nonterminal, &ParseSess, Span, CanSynthesizeMissingTokens) -> TokenStream;
type NtToTokenstream = fn(&Nonterminal, &ParseSess, CanSynthesizeMissingTokens) -> TokenStream;

/// Context of `impl Trait` in code, which determines whether it is allowed in an HIR subtree,
/// and if so, what meaning it has.
@@ -417,12 +416,7 @@ impl<'a> TokenStreamLowering<'a> {
fn lower_token(&mut self, token: Token) -> TokenStream {
match token.kind {
token::Interpolated(nt) => {
let tts = (self.nt_to_tokenstream)(
&nt,
self.parse_sess,
token.span,
self.synthesize_tokens,
);
let tts = (self.nt_to_tokenstream)(&nt, self.parse_sess, self.synthesize_tokens);
TokenTree::Delimited(
DelimSpan::from_single(token.span),
DelimToken::NoDelim,
6 changes: 6 additions & 0 deletions compiler/rustc_error_codes/src/error_codes/E0435.md
@@ -7,6 +7,12 @@ let foo = 42;
let a: [u8; foo]; // error: attempt to use a non-constant value in a constant
```

'constant' means 'a compile-time value'.

More details can be found in the [Variables and Mutability] section of the book.

[Variables and Mutability]: https://doc.rust-lang.org/book/ch03-01-variables-and-mutability.html#differences-between-variables-and-constants

To fix this error, please replace the value with a constant. Example:

```
2 changes: 1 addition & 1 deletion compiler/rustc_expand/src/base.rs
@@ -141,7 +141,7 @@ impl Annotatable {
}

crate fn into_tokens(self, sess: &ParseSess) -> TokenStream {
nt_to_tokenstream(&self.into_nonterminal(), sess, DUMMY_SP, CanSynthesizeMissingTokens::No)
nt_to_tokenstream(&self.into_nonterminal(), sess, CanSynthesizeMissingTokens::No)
}

pub fn expect_item(self) -> P<ast::Item> {
1 change: 0 additions & 1 deletion compiler/rustc_expand/src/expand.rs
@@ -743,7 +743,6 @@ impl<'a, 'b> MacroExpander<'a, 'b> {
AttrStyle::Inner => rustc_parse::fake_token_stream(
&self.cx.sess.parse_sess,
&item.into_nonterminal(),
span,
),
};
let attr_item = attr.unwrap_normal_item();
7 changes: 1 addition & 6 deletions compiler/rustc_expand/src/proc_macro.rs
@@ -94,12 +94,7 @@ impl MultiItemModifier for ProcMacroDerive {
let input = if item.pretty_printing_compatibility_hack() {
TokenTree::token(token::Interpolated(Lrc::new(item)), DUMMY_SP).into()
} else {
nt_to_tokenstream(
&item,
&ecx.sess.parse_sess,
DUMMY_SP,
CanSynthesizeMissingTokens::Yes,
)
nt_to_tokenstream(&item, &ecx.sess.parse_sess, CanSynthesizeMissingTokens::Yes)
};

let server = proc_macro_server::Rustc::new(ecx);
2 changes: 1 addition & 1 deletion compiler/rustc_expand/src/proc_macro_server.rs
@@ -179,7 +179,7 @@ impl FromInternal<(TreeAndSpacing, &'_ ParseSess, &'_ mut Vec<Self>)>
{
TokenTree::Ident(Ident::new(sess, name.name, is_raw, name.span))
} else {
let stream = nt_to_tokenstream(&nt, sess, span, CanSynthesizeMissingTokens::No);
let stream = nt_to_tokenstream(&nt, sess, CanSynthesizeMissingTokens::No);
TokenTree::Group(Group {
delimiter: Delimiter::None,
stream,
2 changes: 1 addition & 1 deletion compiler/rustc_lexer/src/cursor.rs
@@ -33,7 +33,7 @@ impl<'a> Cursor<'a> {

#[cfg(not(debug_assertions))]
{
'\0'
EOF_CHAR
}
}

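The `cursor.rs` change replaces a magic `'\0'` with the crate's `EOF_CHAR` constant. A stand-alone approximation of the pattern (a hypothetical miniature, not the actual `rustc_lexer` source) looks like this:

```rust
// Sentinel returned when peeking past the end of input; a named constant
// keeps the "EOF" meaning in one place instead of a bare '\0' literal.
const EOF_CHAR: char = '\0';

struct Cursor<'a> {
    chars: std::str::Chars<'a>,
}

impl<'a> Cursor<'a> {
    fn new(input: &'a str) -> Self {
        Cursor { chars: input.chars() }
    }

    /// Peeks the next character without consuming it.
    fn first(&self) -> char {
        self.chars.clone().next().unwrap_or(EOF_CHAR)
    }

    /// Consumes and returns the next character, if any.
    fn bump(&mut self) -> Option<char> {
        self.chars.next()
    }
}

fn main() {
    let mut cursor = Cursor::new("ab");
    assert_eq!(cursor.first(), 'a');
    cursor.bump();
    cursor.bump();
    assert_eq!(cursor.first(), EOF_CHAR); // past the end: sentinel, not a panic
}
```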
9 changes: 9 additions & 0 deletions compiler/rustc_mir/src/transform/inline.rs
@@ -41,6 +41,15 @@ impl<'tcx> MirPass<'tcx> for Inline {
return;
}

if tcx.sess.opts.debugging_opts.instrument_coverage {
// Since `Inline` happens after `InstrumentCoverage`, the function-specific coverage
// counters can be invalidated, such as by merging coverage counter statements from
// a pre-inlined function into a different function. This kind of change is invalid,
// so inlining must be skipped. Note: This check is performed here so inlining can
// be disabled without preventing other optimizations (regardless of `mir_opt_level`).
return;
}

if inline(tcx, body) {
debug!("running simplify cfg on {:?}", body.source);
CfgSimplifier::new(body).simplify();
20 changes: 8 additions & 12 deletions compiler/rustc_parse/src/lib.rs
@@ -236,7 +236,6 @@ pub fn parse_in<'a, T>(
pub fn nt_to_tokenstream(
nt: &Nonterminal,
sess: &ParseSess,
span: Span,
synthesize_tokens: CanSynthesizeMissingTokens,
) -> TokenStream {
// A `Nonterminal` is often a parsed AST item. At this point we now
@@ -256,11 +255,9 @@
|tokens: Option<&LazyTokenStream>| tokens.as_ref().map(|t| t.create_token_stream());

let tokens = match *nt {
Nonterminal::NtItem(ref item) => {
prepend_attrs(sess, &item.attrs, nt, span, item.tokens.as_ref())
}
Nonterminal::NtItem(ref item) => prepend_attrs(sess, &item.attrs, nt, item.tokens.as_ref()),
Nonterminal::NtBlock(ref block) => convert_tokens(block.tokens.as_ref()),
Nonterminal::NtStmt(ref stmt) => prepend_attrs(sess, stmt.attrs(), nt, span, stmt.tokens()),
Nonterminal::NtStmt(ref stmt) => prepend_attrs(sess, stmt.attrs(), nt, stmt.tokens()),
Nonterminal::NtPat(ref pat) => convert_tokens(pat.tokens.as_ref()),
Nonterminal::NtTy(ref ty) => convert_tokens(ty.tokens.as_ref()),
Nonterminal::NtIdent(ident, is_raw) => {
@@ -277,31 +274,30 @@
if expr.tokens.is_none() {
debug!("missing tokens for expr {:?}", expr);
}
prepend_attrs(sess, &expr.attrs, nt, span, expr.tokens.as_ref())
prepend_attrs(sess, &expr.attrs, nt, expr.tokens.as_ref())
}
};

if let Some(tokens) = tokens {
return tokens;
} else if matches!(synthesize_tokens, CanSynthesizeMissingTokens::Yes) {
return fake_token_stream(sess, nt, span);
return fake_token_stream(sess, nt);
} else {
let pretty = rustc_ast_pretty::pprust::nonterminal_to_string_no_extra_parens(&nt);
panic!("Missing tokens at {:?} for nt {:?}", span, pretty);
panic!("Missing tokens for nt {:?}", pretty);
}
}

pub fn fake_token_stream(sess: &ParseSess, nt: &Nonterminal, span: Span) -> TokenStream {
pub fn fake_token_stream(sess: &ParseSess, nt: &Nonterminal) -> TokenStream {
let source = pprust::nonterminal_to_string(nt);
let filename = FileName::macro_expansion_source_code(&source);
parse_stream_from_source_str(filename, source, sess, Some(span))
parse_stream_from_source_str(filename, source, sess, Some(nt.span()))
}

fn prepend_attrs(
sess: &ParseSess,
attrs: &[ast::Attribute],
nt: &Nonterminal,
span: Span,
tokens: Option<&tokenstream::LazyTokenStream>,
) -> Option<tokenstream::TokenStream> {
if attrs.is_empty() {
@@ -312,7 +308,7 @@ fn prepend_attrs(
// FIXME: Correctly handle tokens for inner attributes.
// For now, we fall back to reparsing the original AST node
if attr.style == ast::AttrStyle::Inner {
return Some(fake_token_stream(sess, nt, span));
return Some(fake_token_stream(sess, nt));
}
builder.push(attr.tokens());
}
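The thread running through this file and the earlier call-site changes is that `nt_to_tokenstream` and `fake_token_stream` no longer take a `Span` argument: with `Nonterminal::span` now public (first diff above), the span is recovered from the nonterminal itself. A rough, self-contained sketch of that refactoring shape (hypothetical types, not the rustc signatures):

```rust
#[derive(Clone, Copy, Debug, PartialEq)]
struct Span { lo: u32, hi: u32 }

#[derive(Debug)]
enum Node {
    Item(Span),
    Block(Span),
}

impl Node {
    // Analogous in spirit to the newly public `Nonterminal::span`.
    fn span(&self) -> Span {
        match self {
            Node::Item(span) | Node::Block(span) => *span,
        }
    }
}

// Before: callers had to thread the span alongside the node.
fn describe_old(node: &Node, span: Span) -> String {
    format!("{:?} at {:?}", node, span)
}

// After: the node reports its own span, so one parameter disappears
// from every call site.
fn describe_new(node: &Node) -> String {
    format!("{:?} at {:?}", node, node.span())
}

fn main() {
    let node = Node::Item(Span { lo: 0, hi: 10 });
    assert_eq!(describe_old(&node, node.span()), describe_new(&node));
}
```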
2 changes: 1 addition & 1 deletion compiler/rustc_passes/src/check_attr.rs
@@ -359,7 +359,7 @@ impl CheckAttrVisitor<'tcx> {
return false;
}
let item_name = self.tcx.hir().name(hir_id);
if item_name.to_string() == doc_alias {
if &*item_name.as_str() == doc_alias {
self.tcx
.sess
.struct_span_err(
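The `check_attr.rs` change compares the interned name through `as_str` rather than allocating with `to_string`. The idea, reduced to a hypothetical interned-string type (not rustc's `Symbol`), is to compare via a borrowed `&str`:

```rust
// Hypothetical stand-in for an interned symbol; rustc's Symbol resolves to
// interned string data in much the same spirit.
struct Symbol(&'static str);

impl Symbol {
    /// Borrows the interned data; no allocation.
    fn as_str(&self) -> &str {
        self.0
    }

    /// Allocates a fresh String, as the old `to_string` call effectively did.
    fn to_owned_string(&self) -> String {
        self.0.to_owned()
    }
}

fn main() {
    let item_name = Symbol("my_alias");
    let doc_alias = "my_alias";

    // Before: allocate a String just to compare it against a &str.
    assert!(item_name.to_owned_string() == doc_alias);
    // After: compare the borrowed &str directly.
    assert!(item_name.as_str() == doc_alias);
}
```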
8 changes: 7 additions & 1 deletion compiler/rustc_resolve/src/diagnostics.rs
@@ -398,13 +398,19 @@ impl<'a> Resolver<'a> {
err.help("use the `|| { ... }` closure form instead");
err
}
ResolutionError::AttemptToUseNonConstantValueInConstant => {
ResolutionError::AttemptToUseNonConstantValueInConstant(ident, sugg) => {
let mut err = struct_span_err!(
self.session,
span,
E0435,
"attempt to use a non-constant value in a constant"
);
err.span_suggestion(
ident.span,
&sugg,
"".to_string(),
Applicability::MaybeIncorrect,
);
err.span_label(span, "non-constant value");
err
}
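The new `span_suggestion` points at the offending identifier. For context, a minimal program that hits E0435 and the fix the suggestion nudges toward (constants, unlike `let` bindings, are compile-time values usable in array lengths), following the examples already in E0435.md:

```rust
fn main() {
    // This form is rejected with E0435 ("attempt to use a non-constant value
    // in a constant"), because `foo` is a runtime `let` binding:
    //
    //     let foo = 42;
    //     let _a: [u8; foo];

    // The fix: make the length a constant.
    const FOO: usize = 42;
    let _a: [u8; FOO] = [0; FOO];
}
```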