
ambee/giterated

Git repository hosting, collaboration, and discovery for the Fediverse.

More restructuring

Amber - 1 year ago

parent: tbd commit: 10b7b7c

giterated-runtime/giterated-abi/src/lib.rs - 17999 bytes
//! Giterated ABI
//! # ABI
//!
//! ## Value ABI
//!
//! At its core, the Giterated Runtime uses the `extern "C"` ABI. What that means in practice is platform specific, and it
//! doesn't need to matter: you are intended to compile the Giterated Runtime and Plugins for your local machine, all with
//! the same idea of what your "local machine" is.
//!
//! Values are passed using the `FFI` type. There are four categories of value that the `FFI` type enables you to pass:
//!
//! | `FFI` Type Category | Placed Backing? | Owned? |
//! |---------------------|-----------------|--------|
//! | Slice               | Heap/Stack      | No     |
//! | Referenced Slice    | Stack           | No     |
//! | Referenced Value    | No              | No     |
//! | Owned Value         | Heap            | Yes    |
//!
//! For an FFI type to have a "placed backing" is for it to have some data structure beyond the data it represents, placed
//! somewhere in memory. Some types only require stack placement while some offer both stack and heap placement.
//!
//! Stack-placed values can be shared by `PinnedRef` and `PinnedMut`, and thus can only be owned by the caller.
//!
//! Heap-placed values can be shared by `Owned`, `PinnedRef`, and `PinnedMut`. They can be owned by any one consumer.
//! When the handle with ownership is `Drop`'d by the sole consumer, it will free the object using the associated `Drop` callback.
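//!
//! As a loose illustration of the intended flow (a sketch, not normative documentation; the handle
//! names come from the type aliases defined later in this module):
//!
//! ```ignore
//! // Place a value on the heap and hand out ownership through the ABI.
//! let owned: FfiValue<String> = FfiValue::new(String::from("hello"));
//!
//! // Pin the heap placement so borrowed handles can be granted from it.
//! let pinned = owned.pin();
//! let borrowed: FfiValueRef<String> = unsafe { pinned.grant_ref() };
//! drop(borrowed);
//! drop(pinned);
//!
//! // Move the value back out, freeing the heap backing via its drop callback.
//! let value: String = owned.take();
//! ```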
//!
//! ### Safety Intents
//!
//! This API is designed to simplify interaction with FFI values, and to provide a static ABI for those values to be passed. It
//! is key to enabling ownership across FFI while ensuring the associated dropping and allocation-freeing logic is run.
//!
//! This system simplifies the contract the developer has to follow, and it allows generic code to be written to
//! interact with FFI-given values and to pass values over FFI.
//!
//! ### Stability Guarantees
//!
//! There are no plans to guarantee stability until 1.0.0. At that point you can expect the ABI to remain stable until the major version
//! is incremented again. There will be an appropriate deprecation process and changeover period.
//!
//! ### Memory Representation
//!
//! Please check out the source code, sorry if you needed that from the docs!
//!
//! ## Object, Operation, Setting, Value, Plugin, and Runtime ABIs
//!
//! The Giterated Runtime uses vtables to ensure maximum compatibility. For every object that is shared
//! between plugins, a vtable is used to allow each plugin to provide its own code for interacting with the object.
//!
//! When objects switch "runtime domains" (e.g. host -> plugin, plugin -> plugin, plugin -> host), their vtable is swapped out
//! for the new runtime domain's own vtable.
//!
//! ### Untyped "Objects" (see above header for list)
//!
//! Untyped objects, in memory, are represented by a data pointer and a vtable pointer, exactly like Rust trait objects. However, to
//! prevent small compilation differences and other incidental variation from making the interface not perfectly compatible, we use
//! the local plugin's idea of the vtable for the object at all times. An object that the plugin does not have a vtable for cannot
//! be relevant to the plugin.
//!
//! It is important that the object's base representation in memory remain unchanged within a major version, but the vtables that provide methods for
//! that object may be grown. The methods that operate on that object may be changed in a non-breaking fashion, and bugs can be
//! fixed.
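//!
//! As a sketch of the shape described above (the field names here are illustrative; the real
//! definitions live in the `vtable` module):
//!
//! ```ignore
//! #[repr(C)]
//! struct UntypedObject {
//!     /// Pointer to the object's data.
//!     data: *const (),
//!     /// Pointer to the local plugin's vtable for this object type.
//!     vtable: *const (),
//! }
//! ```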
//!
//! ## Futures ABI
//!
//! The Giterated Runtime has an async runtime that allows futures to be shared and awaited across FFI boundaries while only
//! executing the future within the context of the Plugin that is running the underlying future.
//!
//! Futures are spawned onto the `RuntimeState` with the `RuntimeFuturesExt` trait. This takes a Rust future, boxes it, and
//! provides a `RuntimeFuture` handle that can be used to drive the underlying Rust future locally. The `RuntimeFuture` handle
//! is thread-safe and can be shared with the callee and `.await`'d directly like any other future.
//!
//! ### RuntimeFuture
//!
//! The `RuntimeFuture` mixes a vtable with data to allow any caller to drive a spawned future. It contains:
//!
//! - A `poll_fn` which is used to poll the future for `Ready`-ness.
//! - A `wake_fn` which is used to wake the callee to poll for (expected) `Ready`-ness; it is populated when the `RuntimeFuture` is `await`'d.
//!
//! When the `RuntimeFuture` is polled, it causes the inner future to also be polled. We provide the inner future with a waker
//! that triggers the `RuntimeFuture`'s waker so it is polled again. Breaking character to point out how freaking cool that is.
//!
//! `RuntimeFuture`s drop the associated inner future as they drop.
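//!
//! In rough shape (a sketch only; the field names and exact signatures here are illustrative, see
//! the `future` module for the real `FfiFuture` / `RuntimeFuturePoll` definitions):
//!
//! ```ignore
//! #[repr(C)]
//! struct RuntimeFutureSketch {
//!     /// Opaque pointer to the boxed inner future.
//!     inner: *mut (),
//!     /// Polls the inner future, reporting whether it is `Ready`.
//!     poll_fn: unsafe extern "C" fn(inner: *mut ()) -> RuntimeFuturePoll,
//!     /// Wakes the callee so the handle is polled again; populated when the handle is `await`'d.
//!     wake_fn: Option<unsafe extern "C" fn(inner: *mut ())>,
//! }
//! ```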

pub mod callback;
mod future;
pub mod heap;
pub mod model_impl;
pub mod plugin;
pub mod result;
pub mod state;
pub mod vtable;
use abi_backing::{HeapValueBacking, SliceBacking};
pub use future::{FfiFuture, RuntimeFuturePoll};
use heap::HeapPlacable;
use prelude::value_ex::FfiValueUntyped;

use std::{
    marker::PhantomData,
    mem::{transmute, MaybeUninit},
    ops::{Deref, DerefMut},
};

use abi_types::{Slice, SliceMut, SliceRef, Value, ValueMut, ValueRef};
use guards::{HeapPinnedSlice, HeapPinnedValue, StackPinnedSlice, StackPinnedValue};

#[doc(hidden)]
pub mod prelude {
    pub use crate::Ffi;
    pub use crate::StackPinned;
    pub use crate::*;
    pub use crate::{FfiSlice, FfiSliceRef, FfiValue, FfiValueRef};
}

/// Slice Reference
/// Heap or Stack Placed
pub type FfiSliceRef<T> = Ffi<T, SliceRef>;

/// Mutable Slice Reference
/// Heap or Stack Placed
pub type FfiSliceMut<T> = Ffi<T, SliceMut>;

/// Value Reference
/// Heap or Stack Placed
pub type FfiValueRef<T> = Ffi<T, ValueRef>;

/// Mutable Value Reference
/// Heap or Stack Placed
pub type FfiValueMut<T> = Ffi<T, ValueMut>;

/// Owned Value
/// Heap Placed
pub type FfiValue<T> = Ffi<T, Value>;

/// Owned Slice
/// Heap Placed
pub type FfiSlice<T> = Ffi<T, Slice>;

pub mod value_ex {
    use crate::{abi_types::Value, Ffi};

    pub type FfiValueUntyped = Ffi<(), Value>;
    pub type FfiValueRefUntyped = Ffi<(), Value>;
}

/// A value passed over FFI, following the Giterated ABI.
///
/// The function of the [`Ffi`] type is to take an arbitrary pointer and send it over FFI.
/// Both the caller and callee **must** have the same understanding of what the pointer represents.
/// The [`Ffi`] type is also used to encode ownership information.
///
/// # The Pointer
/// The pointer contained within the [`Ffi`] is transmuted based on the provided `ABI` on the
/// [`Ffi`] type signature.
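///
/// As a quick orientation (a restatement of the aliases defined above, not additional API):
///
/// ```ignore
/// Ffi<T, Value>     // FfiValue<T>:     owned value, heap placed
/// Ffi<T, ValueRef>  // FfiValueRef<T>:  borrowed value
/// Ffi<T, Slice>     // FfiSlice<T>:     owned slice, heap placed
/// Ffi<T, SliceRef>  // FfiSliceRef<T>:  borrowed slice
/// ```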
#[repr(transparent)]
pub struct Ffi<T: ?Sized, ABI> {
    inner: *const (),
    _type_marker: PhantomData<T>,
    _abi_marker: PhantomData<ABI>,
}

impl<T> FfiSlice<T> {
    #[inline(always)]
    pub fn pin(&self) -> HeapPinnedSlice<'_, T> {
        unsafe { HeapPinnedSlice::from_raw(self) }
    }
}

impl<T> Deref for FfiSlice<T> {
    type Target = [T];

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const SliceBacking<[T]> = unsafe { transmute(self.inner) };
        let backing = unsafe { inner.as_ref().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts(
                backing.slice as *mut T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}

impl<T> DerefMut for FfiSlice<T> {
    #[inline(always)]
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut SliceBacking<[T]> = unsafe { transmute(self.inner) };
        let backing = unsafe { inner.as_mut().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts_mut(
                backing.slice as *mut T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}

impl<T> FfiSliceRef<T> {}

impl<T> Deref for FfiSliceRef<[T]> {
    type Target = [T];

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const SliceBacking<[T]> = unsafe { transmute(self.inner) };

        let backing = unsafe { inner.as_ref().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts(
                backing.slice as *const T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}

impl<T> FfiValueRef<T> {}

impl<T> Deref for FfiValueRef<T> {
    type Target = T;

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const T = unsafe { transmute(self.inner) };

        match unsafe { inner.as_ref() } {
            Some(val) => val,
            _ => unreachable!(),
        }
    }
}

impl<T> Deref for FfiValueMut<T> {
    type Target = T;

    fn deref(&self) -> &Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_ref().unwrap_unchecked() }
    }
}

impl<T> DerefMut for FfiValueMut<T> {
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_mut().unwrap_unchecked() }
    }
}

impl<T> std::fmt::Display for FfiValueRef<T>
where
    T: std::fmt::Display,
{
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        unsafe { (self.inner as *const T).as_ref().unwrap() }.fmt(f)
    }
}

impl<T> FfiValue<T> {
    pub fn new(value: T) -> Self {
        let value = Box::new(HeapValueBacking {
            value,
            drop_fn: <T as HeapPlacable>::free,
        });

        FfiValue {
            inner: Box::into_raw(value) as _,
            _type_marker: PhantomData,
            _abi_marker: PhantomData,
        }
    }

    pub fn erase_type(self) -> FfiValueUntyped {
        unsafe { transmute(self) }
    }

    pub fn pin(&self) -> HeapPinnedValue<'_, T> {
        unsafe { HeapPinnedValue::from_raw(self) }
    }

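    /// Moves the value out of its heap placement and onto the caller's stack, handing the
    /// backing allocation to the stored `drop_fn`.
    ///
    /// A usage sketch (illustrative, not from the original docs):
    ///
    /// ```ignore
    /// let boxed: FfiValue<u32> = FfiValue::new(42);
    /// assert_eq!(boxed.take(), 42);
    /// ```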
    pub fn take(self) -> T {
        // This all boils down to moving `T` out of the `FfiValue` and dropping the backing
        // storage for said `FfiValue`. Despite the use of unsafe this is exactly how moving
        // a value onto the stack works.

        let inner = self.inner as *mut T;
        let mut move_target: MaybeUninit<T> = MaybeUninit::zeroed();

        unsafe { move_target.as_mut_ptr().copy_from(inner, 1) }

        let inner_descriptor: *mut HeapValueBacking<T> = unsafe { transmute(self.inner) };

        unsafe { (inner_descriptor.as_mut().unwrap_unchecked().drop_fn)(self, true) };

        unsafe { move_target.assume_init() }
    }
}

impl<T> Deref for FfiValue<T> {
    type Target = T;

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const T = unsafe { transmute(self.inner) };

        unsafe { inner.as_ref().unwrap_unchecked() }
    }
}

impl<T> DerefMut for FfiValue<T> {
    #[inline(always)]
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_mut().unwrap_unchecked() }
    }
}

mod abi_backing {
    use std::{marker::PhantomData, mem::transmute};

    use crate::{FfiSlice, FfiValue};

    #[repr(C)]
    pub struct HeapValueBacking<T: Sized> {
        pub(super) value: T,
        pub(super) drop_fn: unsafe extern "C" fn(value: FfiValue<T>, taken: bool),
    }

    pub struct SliceBacking<T: ?Sized> {
        pub(crate) count: u64,
        pub(crate) slice: *const (),
        _marker: PhantomData<T>,
    }

    impl<T: ?Sized> SliceBacking<T> {
        /// Creates a new slice backing from a raw slice pointer and a count.
        ///
        /// # SAFETY
        ///
        /// `slice` **must** refer to a valid slice, with a length greater than or equal to the
        /// value provided as `count`.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(count: u64, slice: *const ()) -> Self {
            Self {
                count,
                slice,
                _marker: PhantomData,
            }
        }

        /// Creates a new slice backing from an [`FfiSlice`].
        ///
        /// # SAFETY
        ///
        /// The resultant [`SliceBacking`] **must not** outlive the backing [`FfiSlice`].
        #[inline(always)]
        pub(crate) unsafe fn from_heap(slice: &FfiSlice<T>) -> Self {
            let heap_backing: *const SliceBacking<T> = unsafe { transmute(slice.inner) };

            let heap_backing = unsafe { heap_backing.as_ref().unwrap_unchecked() };

            Self {
                count: heap_backing.count,
                slice: heap_backing.slice,
                _marker: PhantomData,
            }
        }
    }
}

mod guards {
    use std::marker::PhantomData;

    use crate::{
        abi_backing::SliceBacking, Ffi, FfiSlice, FfiSliceMut, FfiSliceRef, FfiValue, FfiValueMut,
        FfiValueRef,
    };

    #[repr(transparent)]
    pub struct StackPinnedSlice<'v, T: ?Sized> {
        _lifetime: PhantomData<&'v T>,
        slice: SliceBacking<T>,
    }

    impl<'v, T> StackPinnedSlice<'v, T> {
        #[inline(always)]
        pub fn as_ref(&self) -> FfiSliceRef<T> {
            FfiSliceRef {
                inner: &self.slice as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        #[inline(always)]
        pub fn as_mut(&mut self) -> FfiSliceMut<T> {
            FfiSliceMut {
                inner: &mut self.slice as *mut _ as *mut (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

    impl<'v, T> StackPinnedSlice<'v, T> {
        /// Creates a stack pinned slice guard from a borrowed slice.
        ///
        /// # SAFETY
        /// This function itself isn't "unsafe" but other code will become unsafe if the `slice`
        /// becomes invalid or moves. You'd have to violate safety rules somewhere else to do that,
        /// though.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(slice: &'v [T]) -> StackPinnedSlice<'v, T> {
            Self {
                _lifetime: PhantomData,
                slice: SliceBacking::from_raw(
                    u64::try_from(slice.len()).unwrap(),
                    slice.as_ptr() as *const (),
                ),
            }
        }
    }

    pub struct StackPinnedValue<'v, T> {
        value_ref: &'v T,
    }

    impl<'v, T> StackPinnedValue<'v, T> {
        /// Grants a reference to the pinned value.
        ///
        /// # SAFETY
        /// - The granted reference **must not** outlive the lifetime of `&self`.
        /// - There **must not** be a mutable reference created or mutable dereference performed during the lifetime of the [`FfiValueRef`].
        #[inline(always)]
        pub unsafe fn grant_ref(&self) -> FfiValueRef<T> {
            Ffi {
                inner: self.value_ref as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

    impl<'v, T> StackPinnedValue<'v, T> {
        #[inline(always)]
        pub(crate) fn from_raw(value: &'v T) -> Self {
            Self { value_ref: value }
        }
    }

    pub struct HeapPinnedSlice<'v, T> {
        _lifetime: PhantomData<&'v T>,
        slice: SliceBacking<T>,
    }

    impl<'v, T> HeapPinnedSlice<'v, T> {
        /// Creates a pin guard from a heap placed slice.
        ///
        /// # SAFETY
        /// The `slice` **must not** be moved and **must not** have a mutable reference given during the lifetime
        /// of the returned [`HeapPinnedSlice`] guard.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(slice: &'v FfiSlice<T>) -> HeapPinnedSlice<'v, T> {
            Self {
                _lifetime: PhantomData,
                slice: SliceBacking::from_heap(slice),
            }
        }

        pub unsafe fn grant_ref(&self) -> FfiSliceRef<T> {
            FfiSliceRef {
                inner: &self.slice as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        pub unsafe fn grant_mut(&mut self) -> FfiSliceMut<T> {
            FfiSliceMut {
                inner: &mut self.slice as *mut _ as *mut (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

    #[repr(transparent)]
    pub struct HeapPinnedValue<'v, T> {
        value: &'v FfiValue<T>,
    }

    impl<'v, T> HeapPinnedValue<'v, T> {
        #[inline(always)]
        pub(crate) unsafe fn from_raw(value: &'v FfiValue<T>) -> HeapPinnedValue<'v, T> {
            Self { value }
        }

        #[inline(always)]
        pub unsafe fn grant_ref(&self) -> FfiValueRef<T> {
            FfiValueRef {
                inner: self.value.inner,
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        #[inline(always)]
        pub unsafe fn grant_mut(&mut self) -> FfiValueMut<T> {
            FfiValueMut {
                inner: self.value.inner,
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }
}

mod abi_types {
    pub struct Slice;

    pub struct SliceRef;

    pub struct SliceMut;

    pub struct ValueRef;

    pub struct ValueMut;

    pub struct Value;
}

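/// Pins a borrowed value or slice on the stack so that borrowed FFI handles can be granted
/// from it. The sketch below is illustrative only; it reuses the guard types defined above.
///
/// ```ignore
/// // Lend a slice across the FFI boundary without giving up ownership.
/// let numbers = vec![1u32, 2, 3];
/// let pinned = numbers.as_slice().pin();
/// let borrowed: FfiSliceRef<u32> = pinned.as_ref();
/// ```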
pub trait StackPinned<'p> {
    type Pinned: ?Sized + 'p;

    fn pin(&'p self) -> Self::Pinned;
}

impl<'p, T: 'p> StackPinned<'p> for [T] {
    type Pinned = StackPinnedSlice<'p, T>;

    #[inline(always)]
    fn pin(&'p self) -> StackPinnedSlice<'p, T> {
        unsafe { StackPinnedSlice::from_raw(self) }
    }
}

impl<'p, T: 'p> StackPinned<'p> for T {
    type Pinned = StackPinnedValue<'p, T>;

    #[inline(always)]
    fn pin(&'p self) -> Self::Pinned {
        StackPinnedValue::from_raw(self)
    }
}