~sebsite

https://sebsite.pw/

i'm a human i think

Trackers

~sebsite/generic-tetromino-game

Last active 8 months ago

#959 ensuring $PATH lookup behaves as expected 2 months ago

Comment by ~sebsite on ~sircmpwn/hare

I imagine we probably don't want to replicate libc's behavior of consulting confstr(_CS_PATH) if PATH isn't defined (right?)

> We won't be able to completely match libc's execp behavior in lookup, because of separated lookup and exec stages

Why not?

#960 issues with array literal type inference 2 months ago

Comment by ~sebsite on ~sircmpwn/hare

Another issue I thought of:

let a = [[1], [1u8]]; // [2][1]u8

This implies that [1]'s type (before lowering) is [1]iconst. This would need to be handled in other cases too, like if it's used in a tuple literal. Or when taking its address: would it become *[1]iconst, or would it be lowered like &0 is? What about slicing? Would []iconst be a thing? And if we allow the case you gave, then it would also make sense to allow [&0, &0u8], which would make *iconst a thing.

This would also cause issues if/when we add type(expr), (or typeof(expr), whatever). We want to disallow taking the type of a flexible literal, but type([1, 2, 3]) should be allowed IMO.

#960 issues with array literal type inference 2 months ago

Comment by ~sebsite on ~sircmpwn/hare

Ooh, what about expandable arrays?

let a = [[0...], [1, 2, 3]]; // [2][3]int ?

#960 issues with array literal type inference 2 months ago

Comment by ~sebsite on ~sircmpwn/hare

We could afford to be a bit more permissive here (especially with null), but I don't think it's a big deal for this to be order-dependent. It simplifies the compiler, and allows you to figure out the array's member type just by looking at the first member. We already have a bunch of type deduction systems; it would suck to add yet another one, outside of some simple stuff like with nulls and nullable pointers.

Type reduction has some weird semantics that would make it unsuitable for use here without modification (see #945, and my comment on it):

let x: (*int | nullable *int) = &0;
let a = [x, x];

a's type should be [2](*int | nullable *int), but with type reduction it would be [2]nullable *int.

Related to #952, we should also decide whether we want to allow never expressions: let a = [1, abort()]; (or with flipped order).

Some comments on some of the cases you gave:

let a = [[], [1]: []u8]; // [2][]u8

With the slice assignment RFC, this would become let a = [null, &[1]: []u8];, which makes it easier to make that work.

let a = [[1], [1u8]]; // [2][1]u8

I feel like I remember this intentionally not working (in that order)? Because it would complicate the compiler or something? ~ecs would know more about that, since it involves flexible types.

let a = [[1u8], [1i]]; // should not be [2]([1]u8 | [1]int)

What about [[1u8], [1i], [1i: (u8 | int)]]?

let a = [[1,2,3], [1,2]]; // should not be [2]([2]int | [3]int)

Speaking of the slice assignability RFC: should [&[1, 2, 3], &[1, 2]] be allowed (with type [2][]int)? How about [[1, 2, 3][..], &[1, 2]]? I don't think those should be allowed, but if the other cases you gave are allowed, then allowing these would make some sense.

#904 Tuple indexing lexing issues 2 months ago

Comment by ~sebsite on ~sircmpwn/hare

I've been thinking about this, and... maybe this is Fine actually. Like, it feels very icky that lexing is context-dependent, but the implementation isn't actually that difficult. The current syntax is nice to use, so maybe we should just accept that it introduces a quirk in the lexer and leave it as-is?

#957 Issues with default arguments 2 months ago

design added by ~sebsite on ~sircmpwn/hare

#957 Issues with default arguments 2 months ago

spec added by ~sebsite on ~sircmpwn/hare

#957 Issues with default arguments 2 months ago

harec added by ~sebsite on ~sircmpwn/hare

#957 Issues with default arguments 2 months ago

bug added by ~sebsite on ~sircmpwn/hare

#957 Issues with default arguments 2 months ago

Ticket created by ~sebsite on ~sircmpwn/hare

  • NaNs currently cause a harec assertion failure, because the ability to hash NaNs isn't yet implemented, nor is it clear how it even should be implemented.
  • The spec only restricts default arguments to the translation-compatible expression subset, but this poses an issue: how should pointers be hashed? They can't be compared at compile time, and so it follows that it's not possible to give them a unique hash. We could "solve" this by disallowing addresses in default arguments, but that kinda sucks, since they could potentially be useful.
  • Default arguments sometimes aren't emitted correctly in typedefs. For one, aliases of void are emitted as plain void (and cause harec to error later). Additionally, default arguments are only emitted in declarations, not in types. This can cause compiler errors, or potentially even miscompilations. It also makes error messages inaccurate, since the type named by the error message doesn't include the default argument.
  • Slices aren't hashed correctly: [1, 2, 3, 4], [1, 2, 3, 4][..3], and [1, 2, 3, 4][1..] are all considered equivalent, whereas [1, 2, 3] and [1, 2, 3, 4][..3] aren't.
  • Expression hashing is underspecified, and so it's sometimes unclear whether two expressions should hash equivalently:
    • Do 0.0 and -0.0 hash equivalently? Currently no, and that's probably correct, but it's notable, because they "compare" equal everywhere else. That is, if harec's current behavior is correct, then we need to specify that the rules for expression hash equivalence may differ from the rules for expression equality. This is also relevant for NaNs, if we decide that all NaNs should be hashed equivalently.
    • Should t { x = 0 } and t { ... } hash equivalently? They currently do in harec, but that should be explicitly specified.
    • If we allow address arguments, then given let x: union { x: int, y: int }, are &x.x and &x.y hashed equivalently?
    • When are two slices equivalent? Does that question even make sense? They're stack-allocated, so the "value" will be different every time the function is called.
    • How about strings? Intuitively, string literals should be hashed by their contents (as they currently are). But equivalent string literals aren't guaranteed to refer to the same object, so two "identical" strings may (or may not!) point to different addresses. This shouldn't ever cause any real problems, but our current rules for expression hashing sorta imply that functions with equivalent types will have the exact same default arguments (and that the same function will always have the exact same default arguments, regardless of where it's called from), even though this isn't necessarily true.

A common pattern among these issues is that they all stem from the fact that default arguments are hashed as part of the function type itself. This necessitates expression hashing, something that isn't used in any other circumstance. It would be much simpler if default arguments were stored as a property of a function declaration. This would prohibit using them in function types, but maybe that tradeoff is worth it. However, all of these issues would also apply to @default in its current form, so unless that RFC is revised or rejected, there's not much point in not hashing them with function types (we should definitely consider making function pointers assignable if they only differ in their default arguments, though).