What are Rust’s exact auto-dereferencing rules?

I’m learning/experimenting with Rust, and for all the elegance I find in this language, there is one peculiarity that baffles me and seems totally out of place.

Rust automatically dereferences pointers when making method calls. I ran some tests to determine the exact behaviour:

struct X { val: i32 }
impl std::ops::Deref for X {
    type Target = i32;
    fn deref(&self) -> &i32 { &self.val }
}

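// Two test traits: M takes its receiver by value (self), RefM by reference (&self).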
trait M { fn m(self); }
impl M for i32   { fn m(self) { println!("i32::m()");  } }
impl M for X     { fn m(self) { println!("X::m()");    } }
impl M for &X    { fn m(self) { println!("&X::m()");   } }
impl M for &&X   { fn m(self) { println!("&&X::m()");  } }
impl M for &&&X  { fn m(self) { println!("&&&X::m()"); } }

trait RefM { fn refm(&self); }
impl RefM for i32  { fn refm(&self) { println!("i32::refm()");  } }
impl RefM for X    { fn refm(&self) { println!("X::refm()");    } }
impl RefM for &X   { fn refm(&self) { println!("&X::refm()");   } }
impl RefM for &&X  { fn refm(&self) { println!("&&X::refm()");  } }
impl RefM for &&&X { fn refm(&self) { println!("&&&X::refm()"); } }


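// Y derefs to i32, and Z derefs to Y, giving a two-step Deref chain.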
struct Y { val: i32 }
impl std::ops::Deref for Y {
    type Target = i32;
    fn deref(&self) -> &i32 { &self.val }
}

struct Z { val: Y }
impl std::ops::Deref for Z {
    type Target = Y;
    fn deref(&self) -> &Y { &self.val }
}


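// A is a Copy unit struct with impls only for A and &&&A.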
#[derive(Clone, Copy)]
struct A;

impl M for    A { fn m(self) { println!("A::m()");    } }
impl M for &&&A { fn m(self) { println!("&&&A::m()"); } }

impl RefM for    A { fn refm(&self) { println!("A::refm()");    } }
impl RefM for &&&A { fn refm(&self) { println!("&&&A::refm()"); } }


fn main() {
    // I'll use @ to denote the left side of the dot operator
    (*X{val:42}).m();        // i32::m()    , Self == @
    X{val:42}.m();           // X::m()      , Self == @
    (&X{val:42}).m();        // &X::m()     , Self == @
    (&&X{val:42}).m();       // &&X::m()    , Self == @
    (&&&X{val:42}).m();      // &&&X::m()   , Self == @
    (&&&&X{val:42}).m();     // &&&X::m()   , Self == *@
    (&&&&&X{val:42}).m();    // &&&X::m()   , Self == **@
    println!("-------------------------");

    (*X{val:42}).refm();     // i32::refm() , Self == @
    X{val:42}.refm();        // X::refm()   , Self == @
    (&X{val:42}).refm();     // X::refm()   , Self == *@
    (&&X{val:42}).refm();    // &X::refm()  , Self == *@
    (&&&X{val:42}).refm();   // &&X::refm() , Self == *@
    (&&&&X{val:42}).refm();  // &&&X::refm(), Self == *@
    (&&&&&X{val:42}).refm(); // &&&X::refm(), Self == **@
    println!("-------------------------");

    Y{val:42}.refm();        // i32::refm() , Self == *@
    Z{val:Y{val:42}}.refm(); // i32::refm() , Self == **@
    println!("-------------------------");

    A.m();                   // A::m()      , Self == @
    // without the Copy trait, (&A).m() would be a compilation error:
    // cannot move out of borrowed content
    (&A).m();                // A::m()      , Self == *@
    (&&A).m();               // &&&A::m()   , Self == &@
    (&&&A).m();              // &&&A::m()   , Self == @
    A.refm();                // A::refm()   , Self == @
    (&A).refm();             // A::refm()   , Self == *@
    (&&A).refm();            // A::refm()   , Self == **@
    (&&&A).refm();           // &&&A::refm(), Self == @
}

(Playground)

So, it seems that, more or less:

  • The compiler will insert as many dereference operators as necessary to invoke a method (see the fully qualified calls after this list).
  • The compiler, when resolving methods declared using &self (call-by-reference):
    • First tries to call the method for a single dereference of self
    • Then tries to call it for the exact type of self
    • Then tries inserting as many dereference operators as necessary to find a match
  • Methods declared using self (call-by-value) for type T behave as if they were declared using &self (call-by-reference) for type &T and called on the reference to whatever is on the left side of the dot operator.
  • The above rules are first tried with raw built-in dereferencing, and if there’s no match, the overload with the Deref trait is used.
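
To make the inserted adjustments visible, here is what a few of the calls above appear to resolve to (as far as I can tell) when written with fully qualified syntax. These lines reuse the X, A, M and RefM items defined above; appended to the end of main(), each one prints the same message as its method-call counterpart:

    let x = X{val:42};
    let a = A;

    <&&&X as M>::m(*(&&&&x));  // &&&X::m() , like (&&&&x).m():  one deref inserted
    <X as RefM>::refm(&x);     // X::refm() , like (&x).refm():  the &X is passed as &self directly
    <A as RefM>::refm(*(&&a)); // A::refm() , like (&&a).refm(): one deref inserted
    <&&&A as M>::m(&(&&a));    // &&&A::m() , like (&&a).m():    a reference inserted instead of derefs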

What are the exact auto-dereferencing rules? Can anyone give any formal rationale for such a design decision?
