
Delay trait using core::time::Duration #523

Open
@madsmtm


Is there a reason that the embedded_hal::delay::DelayUs API is not defined as follows:

use core::time::Duration;

pub trait Delay {
    fn delay(&mut self, duration: Duration);

    #[inline]
    fn delay_us(&mut self, us: u32) {
        self.delay(Duration::from_micros(us.into()))
    }

    #[inline]
    fn delay_ms(&mut self, ms: u32) {
        self.delay(Duration::from_millis(ms.into()))
    }
}

impl<T: Delay> Delay for &mut T { ... }

I haven't actually implemented this API myself, but it seems like this would be both much cleaner and much more flexible (e.g. allowing nanosecond resolution as well)?
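
As a rough sketch of what an implementation could look like (the BusyWaitDelay type and its sysclk_hz field are hypothetical, purely for illustration), a busy-wait implementor of the proposed trait could convert the Duration into clock cycles and spin, which also lets callers request sub-microsecond delays:

use core::time::Duration;

// Minimal copy of the proposed trait from above, so the sketch stands on its own.
pub trait Delay {
    fn delay(&mut self, duration: Duration);
}

// Hypothetical implementor: busy-waits for a number of CPU cycles derived
// from a known core clock frequency (the type and field are illustrative,
// not an existing HAL type).
pub struct BusyWaitDelay {
    pub sysclk_hz: u64,
}

impl Delay for BusyWaitDelay {
    fn delay(&mut self, duration: Duration) {
        // Duration carries nanosecond resolution, so the conversion to
        // cycles is not limited to whole microseconds or milliseconds.
        let nanos = duration.as_nanos() as u64;
        let cycles = nanos.saturating_mul(self.sysclk_hz) / 1_000_000_000;
        for _ in 0..cycles {
            core::hint::spin_loop();
        }
    }
}

fn main() {
    let mut delay = BusyWaitDelay { sysclk_hz: 48_000_000 };
    // A 250 ns delay is expressible here, but not through a u32-microsecond API.
    delay.delay(Duration::from_nanos(250));
}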

Is the issue that core::time::Duration is too large to pass around?
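
For comparison, Duration stores a u64 seconds count plus a u32 nanoseconds count, so it is larger than the bare u32 the current methods take, though still only a couple of machine words; a quick check on a host target:

use core::mem::size_of;
use core::time::Duration;

fn main() {
    // Duration = u64 seconds + u32 nanoseconds; typically 16 bytes with padding.
    println!("Duration: {} bytes", size_of::<Duration>());
    // The existing delay_us/delay_ms arguments are a single u32.
    println!("u32: {} bytes", size_of::<u32>());
}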
