Allow types to be parameterized by integer (and bool) constant values. #884
Conversation
Thanks for putting this together. I plan to read through it more carefully, but I wanted to mention that some of the type-level stuff in Shoggoth is currently being redesigned and not fully reimplemented; it should be back to normal in a few days. In the meantime, you may want to change the link from the RFC to this revision.
To facilitate triage, #262 and #273 should probably be closed in favour of this. On the other hand, a meta-issue to track advancing this feature in stages might be appropriate. @quantheory already listed this as: ….
@quantheory First, I want to say that I'm definitely on board with the idea that we need a way to be more precise about types, and indexing by numeric values (and also other data) is a feature I would very much like to have in Rust. There are a couple of specific points I'll comment on, first regarding using macros and type-level programming to emulate or "fake" this functionality versus the constant-values proposal, since it is mentioned in the RFC and the former is what I've been using and thinking about a lot recently. Then I'll conclude with some remarks from a slightly different perspective on this issue. Like I mentioned in the other RFC thread, though, I don't consider type-level macros to be intended as a direct solution to this issue; they are more about making what's already possible more convenient.

**comments on comparison with type-level macros**
It's true that type-level numerics implemented as a library would require users to depend on that library's plugins in the rest of their code. I also agree that not having these facilities builtin would mean additional programming burden. However, I think this is an issue affecting any feature-as-a-library. I do agree that in an ideal case, if type-level numerics were to become a thing, it would be best if basic functionality were provided by the compiler and standard libraries. This is what GHC does for type-level naturals. Idris also treats type-level naturals (well, they aren't exactly type-level anymore) specially: I believe they still have a unary (Peano) representation externally but an optimized representation internally. Theoretically, Rust could do something similar without much effort.
I don't view type-level macros as a replacement or substitute for something like singletons or promotion with data kinds. I think that a comprehensive story for type-level programming in Rust will eventually have to encompass those ideas somehow. In fact, I think that it would be optimal to tackle data kinds and higher-kinded types at the same time, as part of a comprehensive kinding story. However, it is worth pointing out that the macro system (with type-level macros) would already be enough to emulate much of what you would get with automatic promotion. It wouldn't be as ergonomic to use, of course, but it's a low-commitment solution. In fact, most of the …. What a macro solution for that doesn't address, though, is how to have efficient internal representations of certain data. This is a more difficult problem, but not one without some potential solutions and literature on the matter. (There would probably be some overlap between such a mechanism and a hypothetical "newtype deriving" feature.)
I think what you are getting at here is that macros are not extensible (at least, not as far as I know…). No doubt this is a challenge. It may not be as bad as it sounds, though, because one should be able to make the interpreters underlying the macros (which translate the expressions) modular with some additional effort. This is possible because traits are open, so the macro can piggyback on (defer to) individual implementations in order to proceed.
I don't quite understand this point. If you mean that a type-level macro could trigger computation, well, yes. But this is going to be the case for anything that isn't a canonical value, even for these constants using operations like …. If the concern is about non-termination, I don't even think it's possible to prevent that, since there's no way to provide evidence that an arbitrary plugin produces its complete output in finite time. There's also the issue about associated types…
I'm thinking what you mean here is that type-level representations of data don't have a unique kind? Crucially, at the term level they do have types, which is both how they can be manipulated using associated types and how it's possible to bridge the term and type level using singletons, i.e., types with a single value:

```rust
struct _0;
struct _1;

let two: (_1, _0) = (_1, _0);
```

In other words, each representation of a value at the type level is a unique type; it's just that without kinds there is no way to classify the extent of such value representations. However, it is possible to approximate kinds with traits:

```rust
trait Bit {}

impl Bit for _0 {}
impl Bit for _1 {}
```

In fact, this seems to be very similar to what you are doing with the …. In order for that to be effective, the compiler will need to know it is a "sealed" trait (i.e., one with only a fixed number of known implementations), which unfortunately is not a concept that can currently be expressed in Rust. Maybe this is what you mean by special compiler support. (I'll come back to this point later.)

**comments on const value parameters**
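To make the singleton/trait-as-kind idea above concrete, here is a minimal sketch in modern Rust. It uses associated consts (which had not landed at the time of this thread) to reify the type-level bits back to runtime values; `TwoBits` is an invented name for illustration, not anything from the RFC:

```rust
use std::marker::PhantomData;

// Type-level bits, as in the discussion above.
struct _0;
struct _1;

// The trait plays the role of a kind, and the associated const
// bridges each type-level bit back to a term-level value.
trait Bit {
    const VALUE: usize;
}
impl Bit for _0 { const VALUE: usize = 0; }
impl Bit for _1 { const VALUE: usize = 1; }

// A two-bit number represented as a pair of type-level bits.
struct TwoBits<Hi: Bit, Lo: Bit>(PhantomData<(Hi, Lo)>);

impl<Hi: Bit, Lo: Bit> TwoBits<Hi, Lo> {
    const VALUE: usize = 2 * Hi::VALUE + Lo::VALUE;
}

fn main() {
    // The type-level value (_1, _0) denotes the number 2.
    assert_eq!(TwoBits::<_1, _0>::VALUE, 2);
    println!("{}", TwoBits::<_1, _0>::VALUE);
}
```

Without kinds, nothing stops a third type from implementing `Bit`; that is exactly the "sealed trait" gap the comment points out.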
I'm not sure this is an apples-to-apples comparison, though. You're still proposing adding special syntax to the type system, e.g., ….

**overall comments**

Just to re-emphasize, I agree with the principle behind this RFC and want the kind of functionality described. I have some different thoughts on how to get there, but I actually think it's more a matter of perspective, which I'll try to explain.

I think one of the key issues here is what all of this should mean in terms of kinds. You do mention data kinds and, briefly, higher-kinded types, but don't go into a lot of detail. My thoughts are basically that in order to do this nicely with regard to future extensibility of the language, the constants should really have a kind, distinct from ….

At that point, I think the remaining distinction between providing the described RFC functionality as a special case of an embedded arithmetic language versus fitting it into the more general type-level programming picture boils down to a matter of efficiency. Algebraic representations of numbers in unary fashion are just unusable, I think we all agree. Binary representations are efficient enough to use but perhaps not optimal. However, I don't think we necessarily have to choose between efficient builtin representations and less efficient but easy-to-program-with external representations. It seems quite possible to provide builtin efficient representations for some kinds like ….

So I think it's possible to reach a point where more or less the same functionality is available, along with mostly the same convenient interface, but without having to special-case arithmetic and bools and forgo more general abstraction facilities like type-level operations implemented using traits. To be honest, I suspect it would require a similar amount of effort to implement this RFC directly as it would to implement a better kind story, along with sealed traits and promotion/singleton literals. (The proposed functionality is essentially relying on these ideas implicitly, I think, just not as first-class features.) But with the latter, we'd have a lot more flexibility in the long run and less pressure to add a whole slew of specialized arithmetic operations and related functionality, because they could be defined by library authors.

Just my $0.02, and like I said, I don't think the conclusions are that different, maybe just how to get there. I also want to say that I'm not suggesting delaying such a feature until a better story about kinds actually lands in Rust, especially since not having constant values for arrays is a real pain at the moment. It's just my opinion that these points should perhaps be considered more thoroughly before going too far down the rabbit hole.
+1 for kinds, or at least do it forward-compatibly ;_;
@freebroccolo (Going with GHC as the reference mainly because I'm familiar with it.) In GHC without extensions there's the arrow kind …. What I'm wondering about is whether it would make sense to structure this slightly differently for Rust, by having a single base kind for compile-time constants, parameterized over the constant's type. So using GHC syntax our base kinds would be ….
I think that you discuss a lot of relevant points, so thanks for sharing your thoughts. We are already on the same page about the majority of this, so I'll confine my response to the points where I think that there may still be some confusion or real disagreement. Nonetheless, I'm afraid that this will be quite long.
Hmm, maybe examples will help. I admit that the second one below is contrived (I made it up off the top of my head) and uses associated consts, which are not implemented in rustc yet. Still, it's plausible that the general pattern could be useful down the line.

```rust
// Example 1: This could be difficult.
const TROUBLE: usize = 32;

struct Foo<N: Nat> {}

fn something_dull<N: Nat>() -> Foo<Expr!(8 * N)> { /* do something */ }

// Example 2: This is worse.
trait SomeTrait {
    const N: usize;
}

fn use_some_nat<M: Nat>() { /* do something */ }

fn some_fun<T: SomeTrait>() {
    const DOUBLE_TROUBLE: usize = 2 * (<T as SomeTrait>::N);
    use_some_nat::<Expr!(DOUBLE_TROUBLE + 64)>();
}
```

Even to have the plugin implementing …. Example 2 is worse in that now it really is impossible to handle unless the ….
Yes. I should have talked more about coming up with a kind story, since it's apparently not clear that that's exactly what I had in mind. So to be quite explicit, the point of this proposal is in fact that ….
Just as a very brief note, perhaps we should start thinking about variadic types as representing another form of kind polymorphism. I hadn't given that much thought yet. I guess we could think of ….

Although it's a separate thing, my mental model here is that there is a ….
Yes. I was thinking of ….
I'm certainly open to syntax changes, but as I hinted at above, I view ….
I think our disconnect is that I really haven't placed a lot of value on the latter. I find inductive representations entertaining but not especially "easy to program with", and so I have difficulty finding the motivation to give them a central role here unless it's clear what additional capability they provide. We already have constant expressions, a plan to implement constants that are outputs of traits, and arrays that accept ….

This is also how I would respond to the comment about implementing arithmetic as a "special case". If we started with this proposal and later expanded to a fuller dependent type system (assuming CTFE as a part of that), there would be nothing special about type-level integers except the specialness that's there purely as a consequence of the fact that value-level integers are already special themselves.
I think that doing the latter, in a way that integrates well with value-level consts and other existing Rust features, requires tackling a superset of the features proposed here. So I don't agree with the comparison.
Yes, I think/hope so. It may be good for me to think about edits to the RFC that make it clearer how I picture this fitting into a kind system.
@quantheory Thanks for responding. I wanted to comment on a few of the points you raise and try to get to the others when I have more time:
From my perspective, the inductive representations are crucial for being able to define new type-level functions/operations using traits and associated types. Unless I missed something in the RFC, there is no way to introduce new functions/operations into the proposed arithmetic fragment. In order to do something like that, you'd need at least pattern matching and recursion, or a higher-order induction operator and lambdas. If the only way to introduce new functionality into the arithmetic fragment is to make more things builtin, I don't see how this is going to be useful in general for type-level programming.

As a meta-comment, I don't think you can really have "dependently typed programming" without providing convenient inductive representations of data, just due to how type information has to be propagated throughout structures and computation, and how things like proof witnesses are constructed. You can provide non-inductive things with some opaque internal structure along with various builtin higher-order induction operations, but these are far less "easy to program with". For example, consider the case where you need to provide evidence that an integer satisfies some property. Maybe you need an operation that only works for some ….
What concerns me is that I have a hard time seeing a clear path from this RFC to a future Rust with a good type-level programming story across the board. It seems you're suggesting several other non-trivial RFCs in order to get to a point that might be comparable to what can already be done with traits/associated-types/macros. I'll be the first to admit that type-level programming with traits/associated-types/macros is quite clunky, and I'd definitely prefer a better way. But given the flexibility and generality it already offers, I'm hesitant to welcome an alternative that won't have comparable flexibility unless several additional unspecified changes to the language also land at some point.

For example, I don't see how, starting from this proposal, even with some of the future RFCs you've suggested, I could implement most of the existing or planned features in Shoggoth. Maybe that's okay; maybe this RFC and the follow-ups aren't really intended to address such things. But if not, I think there ought to be a stronger case for why constant value parameterization deserves such extensive changes and dedicated machinery when it doesn't address so many use cases.

From my perspective, the most convincing argument for following this RFC seems to be convenience in certain use cases. As far as arrays are concerned, constants seem like a reasonable idea and are definitely worth pursuing in the short term. But as far as following this approach and generalizing with several follow-up RFCs, I'm not convinced yet on the technical side, especially concerning such open-ended and unspecified things as CTFE.
For convenience in the general type-level programming case, I think we can achieve this sooner, with less effort and fewer future complications, by building on what we already have in smaller steps (sealed traits, data-kinds/promotion/singletons, efficient type representations) rather than by introducing entirely new mechanisms (constant values vs. traits/associated-types, CTFE vs. macros) which require such extensive changes.
The reason I say this is that in order to implement the functionality proposed in the RFC, you will already need some internal notion like special kinds for the different types of constants, and you will need to treat ….
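The kind of user-defined type-level operation and proof witness described above can be sketched with today's traits and associated types. This is a minimal illustration of the technique, not code from Shoggoth; it uses associated consts (which postdate this thread) only to reify results for checking:

```rust
use std::marker::PhantomData;

// Type-level Peano naturals, encoded inductively.
struct Z;
struct S<N>(PhantomData<N>);

// A trait plays the role of the kind "Nat" and reifies each
// type-level number back to a runtime value.
trait Nat {
    const VALUE: usize;
}
impl Nat for Z { const VALUE: usize = 0; }
impl<N: Nat> Nat for S<N> { const VALUE: usize = 1 + N::VALUE; }

// A user-defined type-level operation: addition, written with
// "pattern matching" (one impl per constructor) and recursion.
trait Add<Rhs> {
    type Sum;
}
impl<Rhs> Add<Rhs> for Z { type Sum = Rhs; }
impl<N: Add<Rhs>, Rhs> Add<Rhs> for S<N> { type Sum = S<N::Sum>; }

// Evidence that a number satisfies a property ("is nonzero"):
// only S<N> implements it, so an API can demand the proof as a bound.
trait NonZero {}
impl<N> NonZero for S<N> {}

fn needs_nonzero<N: Nat + NonZero>() -> usize {
    N::VALUE
}

fn main() {
    type Two = S<S<Z>>;
    type Three = S<S<S<Z>>>;
    type Five = <Two as Add<Three>>::Sum;

    assert_eq!(<Five as Nat>::VALUE, 5);
    // needs_nonzero::<Z>() would fail to compile: Z is not NonZero.
    assert_eq!(needs_nonzero::<Five>(), 5);
    println!("{}", <Five as Nat>::VALUE);
}
```

The unary encoding is exactly the "unusable for real arithmetic" representation discussed earlier; the point here is only that new operations and evidence are definable in library code, which the RFC's arithmetic fragment does not allow.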
Wow. This is, like, the model of a good RFC. From a quick glance it seems to mirror a lot of my thoughts about generics and constants, but a lot more in-depth and thought-out. I need to spend time and digest this more thoroughly :)
Thanks! I know it's a busy time to open a request like this, and it might even end up being postponed for that reason alone, but I'm trying to at least be a good citizen regarding the form of the RFC.
Wow, this is fantastic! I'm glad you settled on a conservative primitives-only design. That's the functionality we really need out of this to talk efficiently about stack-allocated structures. I'm still a bit nervous about arbitrary expressions (…). I'm also a fan of the alternative ….
Nice RFC! I like the …. I'm not a huge fan of the ….
The `;` syntax for distinguishing non-type parameters is potentially problematic because it makes it impossible for type bounds to depend on non-type parameters: ….
I see no reason why this shouldn't work.
@P1start I don't have a particularly strong feeling about ….

@Diggsey We already allow use of parameters before they are declared for types (as long as the declaration is later in the same list), so I don't think that this is an issue. Even if it were, you could always just move the bound to a ….
@Diggsey We already allow both: ….

However, this does raise some concerns for me about how exactly this proposal interacts with defaults. In particular, the precise syntax is less of a bikeshed than I thought.

The last case without the …. Dependency resolution for the …. Some motivating usage examples might help clarify what default rules we inherit. Personally, I want to be able to do: ….

So with BTree I have a clear motivating example of wanting defaults to be independently specified (the allocator and B are inherently decoupled), but both syntaxes allow that. I don't have any use case for allowing "cross references" with the ….
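For what it's worth, defaults for constant parameters did eventually land in Rust (default values for const generics were stabilized in 1.59). A sketch of the BTree-style motivating example with an independently defaulted branching factor; `BTreeish` and its representation are invented for illustration:

```rust
// A container parameterized by a branching factor B with a default,
// in the spirit of the BTree example from the discussion.
struct BTreeish<T, const B: usize = 6> {
    items: Vec<T>, // placeholder representation, not a real B-tree
}

impl<T, const B: usize> BTreeish<T, B> {
    fn branching() -> usize {
        B
    }
}

fn main() {
    // The default B = 6 is used when the parameter is omitted...
    assert_eq!(BTreeish::<u32>::branching(), 6);
    // ...and can be overridden independently of the element type.
    assert_eq!(BTreeish::<u32, 11>::branching(), 11);
    println!("ok");
}
```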
My intention was that only ….
Is there an implementation of this behind a feature gate?
@gnzlbg I've wanted to take a crack at it, but I haven't found the time yet. I was at a conference most of last week and will be on vacation most of next week, so unless someone else wants to give it a try, it'll be a while.
Just a thought: the ….

```rust
// Rather than:
impl Gnarl<i32, {4+4}, u64> for Darl { /* ... */ }

// Instead:
impl Gnarl<i32, const {4+4}, u64> for Darl { /* ... */ }
```

This is more verbose, but it has the advantage that the same syntax can then be used elsewhere with the same meaning. This could be added in a later RFC to allow programmers to ensure that particular expressions get constant-folded, and later extended to trigger CTFE:

```rust
// Require CTFE evaluation:
let x = const { sin(1.3) };
```

I bring this up because one thing that's always a little dicey is storing an expression in a ….
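As it turns out, inline `const` blocks with essentially this meaning were later added to Rust (stabilized in 1.79): the body is guaranteed to be evaluated at compile time. A minimal sketch on modern Rust:

```rust
fn main() {
    // The expression inside the const block is evaluated at compile
    // time; a panic or overflow here would be a compile-time error.
    let x = const { 4 + 4 };
    assert_eq!(x, 8);
    println!("{x}");
}
```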
I finally got around to reading this, and it was worth it: this is one of the most exciting RFCs I've read so far. I would love to have this functionality in Rust. It has always bothered me a bit that arrays get to have length parameters as part of their type but library types cannot do this. I generally like the proposed syntax using ….
While this is a feature Rust will likely get at some point, this is a large feature and I'm inclined not to do it now. |
@brson, would you care to elaborate on what motivates your inclination? Maybe such insights could inform improvements to the plans for this feature outlined here.
@brson I will most likely revise this RFC (and even more so, #865) to account for that experience, and to remove the ….
@aepsil0n My main reasoning is that this is a deep type system change, and none of the core team is prepared to tackle this problem yet. We're in a mode where we want to gain experience with the language we've built and that has gone through such drastic churn recently. Postponing this issue. For further planning and discussion see #1038 |
I hope you guys are at least planning to come back to this issue (soon, I hope). There are many cases where you need a copyable struct, which cannot be done with Vec, and yet the code-reuse paradigm would force you to parameterize over the size of some field. I'm currently writing a math library, and instead of resolving sizes at compile time, I have to check them dynamically and panic! if you operate with wrong sizes.
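The kind of API this commenter wants became expressible once const generics landed in Rust (RFC 2000; a minimal subset stabilized in 1.51). A sketch of a statically sized, copyable matrix type whose dimensions are checked at compile time; `Matrix` and its methods are invented for illustration:

```rust
// A stack-allocated matrix whose dimensions are part of its type.
#[derive(Clone, Copy)]
struct Matrix<const R: usize, const C: usize> {
    data: [[f64; C]; R],
}

impl<const R: usize, const C: usize> Matrix<R, C> {
    fn zero() -> Self {
        Matrix { data: [[0.0; C]; R] }
    }

    // Only matrices of identical shape can be added; a mismatch is a
    // type error, so no runtime size check or panic is needed.
    fn add(&self, other: &Matrix<R, C>) -> Matrix<R, C> {
        let mut out = Matrix::zero();
        for r in 0..R {
            for c in 0..C {
                out.data[r][c] = self.data[r][c] + other.data[r][c];
            }
        }
        out
    }
}

fn main() {
    let a = Matrix::<2, 3>::zero();
    let b = Matrix::<2, 3>::zero();
    let s = a.add(&b);
    assert_eq!(s.data.len(), 2);
    assert_eq!(s.data[0].len(), 3);
    // a.add(&Matrix::<3, 2>::zero()) would not compile.
    println!("ok");
}
```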
See also #1038.
I was somewhat reluctant to push this now, because there are so many other things going on (and I personally have opened a few smaller PRs recently). However, I think that the basic design we discussed on the forum has reached a stable point.
This is also relevant to the motivation discussed for the macros-in-types proposal (#873). While that proposal does not directly compete with this one, it does suggest implementing some of the features that are the main thrust of this proposal through the macro system instead.
Rendered.