Struct vm_memory::volatile_memory::VolatileSlice
pub struct VolatileSlice<'a, B = ()> { /* private fields */ }
A slice of raw memory that supports volatile access.
Implementations§
impl<'a> VolatileSlice<'a, ()>

pub unsafe fn new(addr: *mut u8, size: usize) -> VolatileSlice<'a>
Creates a slice of raw memory that must support volatile access.
§Safety
To use this safely, the caller must guarantee that the memory at `addr` is `size` bytes long and is available for the duration of the lifetime of the new `VolatileSlice`. The caller must also guarantee that all other users of the given chunk of memory are using volatile accesses.
impl<'a, B: BitmapSlice> VolatileSlice<'a, B>

pub unsafe fn with_bitmap(addr: *mut u8, size: usize, bitmap: B) -> VolatileSlice<'a, B>
Creates a slice of raw memory that must support volatile access, and uses the provided `bitmap` object for dirty page tracking.
§Safety
To use this safely, the caller must guarantee that the memory at `addr` is `size` bytes long and is available for the duration of the lifetime of the new `VolatileSlice`. The caller must also guarantee that all other users of the given chunk of memory are using volatile accesses.
pub fn as_ptr(&self) -> *mut u8
Returns a pointer to the beginning of the slice. Mutable accesses performed using the resulting pointer are not automatically accounted for by the dirty bitmap tracking functionality.
pub fn split_at(&self, mid: usize) -> Result<(Self, Self)>
Divides one slice into two at an index.
§Example
```rust
let mut mem = [0u8; 32];
let mem_ref = &mut mem[..];
let vslice = mem_ref
    .get_slice(0, 32)
    .expect("Could not get VolatileSlice");
let (start, end) = vslice.split_at(8).expect("Could not split VolatileSlice");
assert_eq!(8, start.len());
assert_eq!(24, end.len());
```
pub fn subslice(&self, offset: usize, count: usize) -> Result<Self>
Returns a subslice of this `VolatileSlice` starting at `offset` with `count` length.

The returned subslice is a copy of this slice with the address increased by `offset` bytes and the size set to `count` bytes.
pub fn offset(&self, count: usize) -> Result<VolatileSlice<'a, B>>
Returns a subslice of this `VolatileSlice` starting at offset `count`.

The returned subslice is a copy of this slice with the address increased by `count` bytes and the size reduced by `count` bytes.
pub fn copy_to<T>(&self, buf: &mut [T]) -> usize where T: ByteValued
Copies as many elements of type `T` as possible from this slice to `buf`.

Copies `self.len()` or `buf.len()` times the size of `T` bytes, whichever is smaller, to `buf`. The copy happens from smallest to largest address in `T`-sized chunks using volatile reads.
§Examples
```rust
let mut mem = [0u8; 32];
let mem_ref = &mut mem[..];
let vslice = mem_ref
    .get_slice(0, 32)
    .expect("Could not get VolatileSlice");
let mut buf = [5u8; 16];
let res = vslice.copy_to(&mut buf[..]);
assert_eq!(16, res);
for &v in &buf[..] {
    assert_eq!(v, 0);
}
```
pub fn copy_to_volatile_slice<S: BitmapSlice>(&self, slice: VolatileSlice<'_, S>)
Copies as many bytes as possible from this slice to the provided `slice`.

The copies happen in an undefined order.
§Examples
```rust
let mut mem = [0u8; 32];
let mem_ref = &mut mem[..];
let vslice = mem_ref
    .get_slice(0, 32)
    .expect("Could not get VolatileSlice");
vslice.copy_to_volatile_slice(
    vslice
        .get_slice(16, 16)
        .expect("Could not get VolatileSlice"),
);
```
pub fn copy_from<T>(&self, buf: &[T]) where T: ByteValued
Copies as many elements of type `T` as possible from `buf` to this slice.

The copy happens from smallest to largest address in `T`-sized chunks using volatile writes.
§Examples
```rust
let mut mem = [0u8; 32];
let mem_ref = &mut mem[..];
let vslice = mem_ref
    .get_slice(0, 32)
    .expect("Could not get VolatileSlice");
let buf = [5u8; 64];
vslice.copy_from(&buf[..]);
for i in 0..4 {
    let val = vslice
        .get_ref::<u32>(i * 4)
        .expect("Could not get value")
        .load();
    assert_eq!(val, 0x05050505);
}
```
Trait Implementations§
impl<B: BitmapSlice> Bytes<usize> for VolatileSlice<'_, B>

fn write(&self, buf: &[u8], addr: usize) -> Result<usize>
§Examples
- Write a slice of size 5 at offset 1020 of a 1024-byte `VolatileSlice`. Only 4 bytes fit, so `write` returns 4.

```rust
let mut mem = [0u8; 1024];
let mut mem_ref = &mut mem[..];
let vslice = mem_ref.as_volatile_slice();
let res = vslice.write(&[1, 2, 3, 4, 5], 1020);
assert!(res.is_ok());
assert_eq!(res.unwrap(), 4);
```
fn read(&self, buf: &mut [u8], addr: usize) -> Result<usize>
§Examples
- Read a slice of size 16 at offset 1010 of a 1024-byte `VolatileSlice`. Only 14 bytes remain, so `read` returns 14.

```rust
let mut mem = [0u8; 1024];
let mut mem_ref = &mut mem[..];
let vslice = mem_ref.as_volatile_slice();
let buf = &mut [0u8; 16];
let res = vslice.read(buf, 1010);
assert!(res.is_ok());
assert_eq!(res.unwrap(), 14);
```
fn write_slice(&self, buf: &[u8], addr: usize) -> Result<()>
§Examples
- Write a slice at offset 256 of a 1024-byte `VolatileSlice`.

```rust
let mut mem = [0u8; 1024];
let mut mem_ref = &mut mem[..];
let vslice = mem_ref.as_volatile_slice();
let res = vslice.write_slice(&[1, 2, 3, 4, 5], 256);
assert!(res.is_ok());
assert_eq!(res.unwrap(), ());
```
fn read_slice(&self, buf: &mut [u8], addr: usize) -> Result<()>
§Examples
- Read a slice of size 16 at offset 256 of a 1024-byte `VolatileSlice`.

```rust
let mut mem = [0u8; 1024];
let mut mem_ref = &mut mem[..];
let vslice = mem_ref.as_volatile_slice();
let buf = &mut [0u8; 16];
let res = vslice.read_slice(buf, 256);
assert!(res.is_ok());
```
fn read_from<F>(&self, addr: usize, src: &mut F, count: usize) -> Result<usize> where F: Read
§Examples
- Read bytes from /dev/urandom into a 1024-byte `VolatileSlice`.

```rust
use std::fs::File;
use std::path::Path;

let mut mem = [0u8; 1024];
let mut mem_ref = &mut mem[..];
let vslice = mem_ref.as_volatile_slice();
let mut file = File::open(Path::new("/dev/urandom")).expect("Could not open /dev/urandom");
vslice
    .read_from(32, &mut file, 128)
    .expect("Could not read bytes from file into VolatileSlice");
let rand_val: u32 = vslice
    .read_obj(40)
    .expect("Could not read value from VolatileSlice");
```
fn read_exact_from<F>(&self, addr: usize, src: &mut F, count: usize) -> Result<()> where F: Read
§Examples
- Read bytes from /dev/urandom into a 1024-byte `VolatileSlice`.

```rust
use std::fs::File;
use std::path::Path;

let mut mem = [0u8; 1024];
let mut mem_ref = &mut mem[..];
let vslice = mem_ref.as_volatile_slice();
let mut file = File::open(Path::new("/dev/urandom")).expect("Could not open /dev/urandom");
vslice
    .read_exact_from(32, &mut file, 128)
    .expect("Could not read bytes from file into VolatileSlice");
let rand_val: u32 = vslice
    .read_obj(40)
    .expect("Could not read value from VolatileSlice");
```
fn write_to<F>(&self, addr: usize, dst: &mut F, count: usize) -> Result<usize> where F: Write
§Examples
- Write 128 bytes from a 1024-byte `VolatileSlice` to /dev/null.

```rust
use std::fs::OpenOptions;

let mut mem = [0u8; 1024];
let mut mem_ref = &mut mem[..];
let vslice = mem_ref.as_volatile_slice();
let mut file = OpenOptions::new()
    .write(true)
    .open("/dev/null")
    .expect("Could not open /dev/null");
vslice
    .write_to(32, &mut file, 128)
    .expect("Could not write value from VolatileSlice to /dev/null");
```
fn write_all_to<F>(&self, addr: usize, dst: &mut F, count: usize) -> Result<()> where F: Write
§Examples
- Write 128 bytes from a 1024-byte `VolatileSlice` to /dev/null.

```rust
use std::fs::OpenOptions;

let mut mem = [0u8; 1024];
let mut mem_ref = &mut mem[..];
let vslice = mem_ref.as_volatile_slice();
let mut file = OpenOptions::new()
    .write(true)
    .open("/dev/null")
    .expect("Could not open /dev/null");
vslice
    .write_all_to(32, &mut file, 128)
    .expect("Could not write value from VolatileSlice to /dev/null");
```
fn store<T: AtomicAccess>(&self, val: T, addr: usize, order: Ordering) -> Result<()>

fn load<T: AtomicAccess>(&self, addr: usize, order: Ordering) -> Result<T>
impl<'a, B: Clone> Clone for VolatileSlice<'a, B>

fn clone(&self) -> VolatileSlice<'a, B>

fn clone_from(&mut self, source: &Self)
Performs copy-assignment from `source`.

impl<'a, B: Debug> Debug for VolatileSlice<'a, B>
impl<'a, B: BitmapSlice> From<VolatileSlice<'a, B>> for VolatileArrayRef<'a, u8, B>

fn from(slice: VolatileSlice<'a, B>) -> Self
impl<B: BitmapSlice> VolatileMemory for VolatileSlice<'_, B>

fn as_volatile_slice(&self) -> VolatileSlice<'_, BS<'_, Self::B>>

fn get_ref<T: ByteValued>(&self, offset: usize) -> Result<VolatileRef<'_, T, BS<'_, Self::B>>>
Returns a `VolatileRef` at `offset`.

fn get_array_ref<T: ByteValued>(&self, offset: usize, n: usize) -> Result<VolatileArrayRef<'_, T, BS<'_, Self::B>>>
unsafe fn aligned_as_ref<T: ByteValued>(&self, offset: usize) -> Result<&T>

unsafe fn aligned_as_mut<T: ByteValued>(&self, offset: usize) -> Result<&mut T>
Returns a mutable reference to an instance of `T` at `offset`. Mutable accesses performed using the resulting reference are not automatically accounted for by the dirty bitmap tracking functionality.

fn get_atomic_ref<T: AtomicInteger>(&self, offset: usize) -> Result<&T>
Returns a reference to an instance of `T` at `offset`. Mutable accesses performed using the resulting reference are not automatically accounted for by the dirty bitmap tracking functionality.

impl<'a, B: Copy> Copy for VolatileSlice<'a, B>
Auto Trait Implementations§
impl<'a, B> Freeze for VolatileSlice<'a, B> where B: Freeze
impl<'a, B> RefUnwindSafe for VolatileSlice<'a, B> where B: RefUnwindSafe
impl<'a, B = ()> !Send for VolatileSlice<'a, B>
impl<'a, B = ()> !Sync for VolatileSlice<'a, B>
impl<'a, B> Unpin for VolatileSlice<'a, B> where B: Unpin
impl<'a, B> UnwindSafe for VolatileSlice<'a, B> where B: UnwindSafe
Blanket Implementations§
impl<T> BorrowMut<T> for T where T: ?Sized

fn borrow_mut(&mut self) -> &mut T

impl<T> CloneToUninit for T where T: Clone

default unsafe fn clone_to_uninit(&self, dst: *mut T)
(unstable feature: clone_to_uninit)

impl<T> CloneToUninit for T where T: Copy

unsafe fn clone_to_uninit(&self, dst: *mut T)
(unstable feature: clone_to_uninit)