
ambee/giterated

Git repository hosting, collaboration, and discovery for the Fediverse.

Spinning

Amber - 1 year ago

parent: tbd commit: 1788060

giterated-runtime/giterated-abi/src/lib.rs - 18018 bytes
//! Giterated ABI
//! # ABI
//!
//! ## Value ABI
//!
//! At its core, the Giterated Runtime uses the `extern "C"` ABI. Exactly what that means is platform specific and does not
//! matter here: you are intended to compile the Giterated Runtime and Plugins for your local machine, all with the same idea
//! of what your "local machine" is.
//!
//! Values are passed using the `FFI` type. There are four categories of value that the `FFI` type enables you to pass:
//!
//! | `FFI` Type Category | Placed Backing? | Owned? |
//! |---------------------|-----------------|--------|
//! | Slice               | Heap/Stack      | No     |
//! | Referenced Slice    | Stack           | No     |
//! | Referenced Value    | No              | No     |
//! | Owned Value         | Heap            | Yes    |
//!
//! For an FFI type to have a "placed backing" is for it to have some data structure, beyond the data it represents, placed
//! somewhere in memory. Some types only require stack placement while some offer both stack and heap placement.
//!
//! Stack-placed values can be shared by `PinnedRef` and `PinnedMut`, and thus can only be owned by the caller.
//!
//! Heap-placed values can be shared by `Owned`, `PinnedRef`, and `PinnedMut`. They can be owned by any one consumer.
//! When the handle with ownership is `Drop`'d by the sole consumer, it will free the object using the associated `Drop` callback.
//!
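//! A rough sketch of how these categories show up in code, using the types defined in this crate (`FfiValue`, the
//! `StackPinned` pinning trait, and the pin guards); the call sites and crate path are illustrative, not prescriptive:
//!
//! ```ignore
//! use giterated_abi::prelude::*;
//!
//! // Stack-placed: pin a caller-owned value in place, then grant a borrowed reference to it.
//! let config = String::from("example");
//! let pinned = config.pin();
//! let by_ref: FfiValueRef<String> = unsafe { pinned.grant_ref() };
//!
//! // Heap-placed, owned: move the value behind a heap-backed handle, then take ownership back.
//! let owned: FfiValue<String> = FfiValue::new(String::from("example"));
//! let value: String = owned.take();
//! ```
//!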
//! ### Safety Intents
//!
//! This API is designed to simplify interaction with FFI values and provide a static ABI for those values to be passed. It
//! is key to enabling ownership across FFI while ensuring the associated dropping and allocation-freeing logic is run.
//!
//! This system simplifies the contract the developer has to follow, and it allows generic code to be written to
//! interact with FFI-given values and pass values using FFI.
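//!
//! As a hedged sketch of the contract this enables, a plugin-side entry point might receive a borrowed value while the
//! caller keeps ownership; the function name below is hypothetical, only `FfiValueRef` and the pinning API come from this crate:
//!
//! ```ignore
//! // Hypothetical callee: reads the value through `Deref` without taking ownership of it.
//! #[no_mangle]
//! pub extern "C" fn handle_message(message: FfiValueRef<String>) {
//!     println!("got: {}", &*message);
//! }
//!
//! // Caller side: pin a local value and grant a reference that lives no longer than the pin guard.
//! let message = String::from("hello");
//! let pinned = message.pin();
//! handle_message(unsafe { pinned.grant_ref() });
//! ```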
//!
//! ### Stability Guarantees
//!
//! There are no plans to guarantee stability until 1.0.0. At that point you can expect the ABI to remain stable until the major version
//! is incremented again. There will be an appropriate deprecation process and changeover period.
//!
//! ### Memory Representation
//!
//! Please check out the source code, sorry if you needed that from the docs!
//!
//! ## Object, Operation, Setting, Value, Plugin, and Runtime ABIs
//!
//! The Giterated Runtime uses vtables to accomplish the goal of ensuring maximum compatibility. For every object that is shared
//! between plugins, a vtable is used to allow each plugin to provide its own code for interacting with the object.
//!
//! When objects switch "runtime domains" (e.g. host -> plugin, plugin -> plugin, plugin -> host), their vtable is swapped out
//! for the new runtime domain's own vtables.
//!
//! ### Untyped "Objects" (see above header for list)
//!
//! Untyped objects, in memory, are represented by a data pointer and a vtable pointer, exactly like Rust trait objects. However, to
//! prevent small compilation differences and other random garbage from making the interface subtly incompatible, we use
//! the local plugin's idea of the vtable for the object at all times. An object that the plugin does not have a vtable for cannot
//! be relevant to the plugin.
//!
//! It is important that the object's base representation in memory remain unchanged between major versions, but the vtables that provide methods for
//! that object may be grown. The methods that operate on that object may be changed in a non-breaking fashion, and bugs can be
//! fixed.
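//!
//! A hedged sketch of the layout this section describes; the field and type names below are illustrative, not the crate's
//! actual definitions:
//!
//! ```ignore
//! /// Illustrative only: an untyped object is a data pointer plus a pointer to the
//! /// *local* plugin's vtable for that object, exactly like a Rust trait object.
//! #[repr(C)]
//! struct UntypedObject {
//!     data: *const (),
//!     vtable: *const ObjectVtable,
//! }
//!
//! /// Illustrative only: existing entries keep their position and signature across
//! /// non-breaking releases; new function pointers may only be appended.
//! #[repr(C)]
//! struct ObjectVtable {
//!     object_name: unsafe extern "C" fn(object: *const ()) -> *const (),
//!     // ...entries may be appended in later versions...
//! }
//! ```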
//!
//! ## Futures ABI
//!
//! The Giterated Runtime has an async runtime that allows futures to be shared and awaited across FFI boundaries while only
//! executing the future within the context of the Plugin that is running the underlying future.
//!
//! Futures are spawned onto the `RuntimeState` with the `RuntimeFuturesExt` trait. This takes a Rust future, boxes it, and
//! provides a `RuntimeFuture` handle that can be used to drive the underlying Rust future locally. The `RuntimeFuture` handle
//! is thread-safe and can be shared with the callee and `.await`'d directly like any other future.
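//!
//! A sketch of the intended flow; `RuntimeState` and `RuntimeFuturesExt` live in the runtime crates, and the spawning
//! method name used here is hypothetical:
//!
//! ```ignore
//! // Spawn a Rust future onto the runtime; the returned `RuntimeFuture` handle can cross FFI.
//! let handle = runtime_state.spawn_future(async move {
//!     perform_lookup().await
//! });
//!
//! // The handle is itself a future, so the other side can simply await it.
//! let result = handle.await;
//! ```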
//!
//! ### RuntimeFuture
//!
//! The `RuntimeFuture` mixes a vtable with data to allow any caller to drive a spawned future. It contains:
//!
//! - A `poll_fn`, which is used to poll the future for `Ready`-ness.
//! - A `wake_fn`, which is used to wake the callee to poll for (expected) `Ready`-ness; it is populated when the `RuntimeFuture` is `await`'d.
//!
//! When the `RuntimeFuture` is polled, it causes the inner future to also be polled. We provide the inner future with a waker
//! that triggers the `RuntimeFuture`'s waker so it is polled again. Breaking character to point out how freaking cool that is.
//!
//! `RuntimeFuture`s drop the associated inner future as they drop.
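//!
//! The rough shape this describes, as a hedged sketch; the struct and field names are illustrative, and only
//! `RuntimeFuturePoll` is a real type re-exported from this crate:
//!
//! ```ignore
//! /// Illustrative only: a future handle that mixes data with a small vtable.
//! #[repr(C)]
//! struct RuntimeFutureSketch {
//!     /// Drives the spawned future one step and reports readiness.
//!     poll_fn: unsafe extern "C" fn(future_data: *mut ()) -> RuntimeFuturePoll,
//!     /// Wakes the callee so it polls again; populated when the handle is awaited.
//!     wake_fn: Option<unsafe extern "C" fn(future_data: *mut ())>,
//!     /// The boxed inner future, dropped when the handle drops.
//!     future_data: *mut (),
//! }
//! ```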

pub mod callback;
mod future;
pub mod heap;
pub mod model_impl;
pub mod operation;
pub mod plugin;
pub mod result;
pub mod state;
pub mod vtable;
use abi_backing::{HeapValueBacking, SliceBacking};
pub use future::{FfiFuture, RuntimeFuturePoll};
use heap::HeapPlacable;
use prelude::value_ex::FfiValueUntyped;

use std::{
    marker::PhantomData,
    mem::{transmute, MaybeUninit},
    ops::{Deref, DerefMut},
};

use abi_types::{Slice, SliceMut, SliceRef, Value, ValueMut, ValueRef};
use guards::{HeapPinnedSlice, HeapPinnedValue, StackPinnedSlice, StackPinnedValue};

#[doc(hidden)]
pub mod prelude {
    pub use crate::Ffi;
    pub use crate::StackPinned;
    pub use crate::*;
    pub use crate::{FfiSlice, FfiSliceRef, FfiValue, FfiValueRef};
}

/// Slice Reference
/// Heap or Stack Placed
pub type FfiSliceRef<T> = Ffi<T, SliceRef>;

/// Mutable Slice Reference
/// Heap or Stack Placed
pub type FfiSliceMut<T> = Ffi<T, SliceMut>;

/// Value Reference
/// Heap or Stack Placed
pub type FfiValueRef<T> = Ffi<T, ValueRef>;

/// Mutable Value Reference
/// Heap or Stack Placed
pub type FfiValueMut<T> = Ffi<T, ValueMut>;

/// Owned Value
/// Heap Placed
pub type FfiValue<T> = Ffi<T, Value>;

/// Owned Slice
/// Heap Placed
pub type FfiSlice<T> = Ffi<T, Slice>;

pub mod value_ex {
    use crate::{abi_types::Value, Ffi};

    pub type FfiValueUntyped = Ffi<(), Value>;
    pub type FfiValueRefUntyped = Ffi<(), Value>;
}

/// A value passed over FFI, following the Giterated ABI.
///
/// The function of the [`Ffi`] type is to take an arbitrary pointer and send it over FFI.
/// Both the caller and callee **must** have the same understanding of what the pointer represents.
/// The [`Ffi`] type is also used to encode ownership information.
///
/// # The Pointer
/// The pointer contained within the [`Ffi`] is transmuted based on the provided `ABI` on the
/// [`Ffi`] type signature.
#[repr(transparent)]
pub struct Ffi<T: ?Sized, ABI> {
    inner: *const (),
    _type_marker: PhantomData<T>,
    _abi_marker: PhantomData<ABI>,
}

impl<T> FfiSlice<T> {
    #[inline(always)]
    pub fn pin(&self) -> HeapPinnedSlice<'_, T> {
        unsafe { HeapPinnedSlice::from_raw(self) }
    }
}

impl<T> Deref for FfiSlice<T> {
    type Target = [T];

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const SliceBacking<[T]> = unsafe { transmute(self.inner) };
        let backing = unsafe { inner.as_ref().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts(
                backing.slice as *mut T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}
impl<T> DerefMut for FfiSlice<T> {
    #[inline(always)]
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut SliceBacking<[T]> = unsafe { transmute(self.inner) };
        let backing = unsafe { inner.as_mut().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts_mut(
                backing.slice as *mut T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}

impl<T> FfiSliceRef<T> {}

impl<T> Deref for FfiSliceRef<[T]> {
    type Target = [T];

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const SliceBacking<[T]> = unsafe { transmute(self.inner) };

        let backing = unsafe { inner.as_ref().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts(
                backing.slice as *const T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}

impl<T> FfiValueRef<T> {}

impl<T> Deref for FfiValueRef<T> {
    type Target = T;

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const T = unsafe { transmute(self.inner) };

        match unsafe { inner.as_ref() } {
            Some(val) => val,
            _ => unreachable!(),
        }
    }
}

impl<T> Deref for FfiValueMut<T> {
    type Target = T;

    fn deref(&self) -> &Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_ref().unwrap_unchecked() }
    }
}
impl<T> DerefMut for FfiValueMut<T> {
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_mut().unwrap_unchecked() }
    }
}

impl<T> std::fmt::Display for FfiValueRef<T>
where
    T: std::fmt::Display,
{
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        unsafe { (self.inner as *const T).as_ref().unwrap() }.fmt(f)
    }
}

impl<T> FfiValue<T> {
    pub fn new(value: T) -> Self {
        let value = Box::new(HeapValueBacking {
            value,
            drop_fn: <T as HeapPlacable>::free,
        });

        FfiValue {
            inner: Box::into_raw(value) as _,
            _type_marker: PhantomData,
            _abi_marker: PhantomData,
        }
    }

    pub fn erase_type(self) -> FfiValueUntyped {
        unsafe { transmute(self) }
    }

    pub fn pin(&self) -> HeapPinnedValue<'_, T> {
        unsafe { HeapPinnedValue::from_raw(self) }
    }

    pub fn take(self) -> T {
        // This all boils down to moving `T` out of the `FfiValue` and dropping the backing
        // storage for said `FfiValue`. Despite the use of unsafe this is exactly how moving
        // a value onto the stack works.

        let inner = self.inner as *mut T;
        let mut move_target: MaybeUninit<T> = MaybeUninit::zeroed();

        unsafe { move_target.as_mut_ptr().copy_from(inner, 1) }

        let inner_descriptor: *mut HeapValueBacking<T> = unsafe { transmute(self.inner) };

        unsafe { (inner_descriptor.as_mut().unwrap_unchecked().drop_fn)(self, true) };

        unsafe { move_target.assume_init() }
    }
}

impl<T> Deref for FfiValue<T> {
    type Target = T;

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const T = unsafe { transmute(self.inner) };

        unsafe { inner.as_ref().unwrap_unchecked() }
    }
}
impl<T> DerefMut for FfiValue<T> {
    #[inline(always)]
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_mut().unwrap_unchecked() }
    }
}

mod abi_backing {
    use std::{marker::PhantomData, mem::transmute};

    use crate::{FfiSlice, FfiValue};

    #[repr(C)]
    pub struct HeapValueBacking<T: Sized> {
        pub(super) value: T,
        pub(super) drop_fn: unsafe extern "C" fn(value: FfiValue<T>, taken: bool),
    }

    pub struct SliceBacking<T: ?Sized> {
        pub(crate) count: u64,
        pub(crate) slice: *const (),
        _marker: PhantomData<T>,
    }

    impl<T: ?Sized> SliceBacking<T> {
        /// Creates a new slice backing from a raw slice pointer and a count.
        ///
        /// # SAFETY
        ///
        /// `slice` **must** refer to a valid slice, with a length greater than or equal to the
        /// value provided as `count`.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(count: u64, slice: *const ()) -> Self {
            Self {
                count,
                slice,
                _marker: PhantomData,
            }
        }

        /// Creates a new slice backing from an [`FfiSlice`].
        ///
        /// # SAFETY
        ///
        /// The resultant [`SliceBacking`] **must not** outlive the backing [`FfiSlice`].
        #[inline(always)]
        pub(crate) unsafe fn from_heap(slice: &FfiSlice<T>) -> Self {
            let heap_backing: *const SliceBacking<T> = unsafe { transmute(slice.inner) };

            let heap_backing = unsafe { heap_backing.as_ref().unwrap_unchecked() };

            Self {
                count: heap_backing.count,
                slice: heap_backing.slice,
                _marker: PhantomData,
            }
        }
    }
}

mod guards {
    use std::marker::PhantomData;

    use crate::{
        abi_backing::SliceBacking, Ffi, FfiSlice, FfiSliceMut, FfiSliceRef, FfiValue, FfiValueMut,
        FfiValueRef,
    };

    #[repr(transparent)]
    pub struct StackPinnedSlice<'v, T: ?Sized> {
        _lifetime: PhantomData<&'v T>,
        slice: SliceBacking<T>,
    }

    impl<'v, T> StackPinnedSlice<'v, T> {
        #[inline(always)]
        pub fn as_ref(&self) -> FfiSliceRef<T> {
            FfiSliceRef {
                inner: &self.slice as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        #[inline(always)]
        pub fn as_mut(&mut self) -> FfiSliceMut<T> {
            FfiSliceMut {
                inner: &mut self.slice as *mut _ as *mut (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

    impl<'v, T> StackPinnedSlice<'v, T> {
        /// Creates a stack pinned slice guard from a borrowed slice.
        ///
        /// # SAFETY
        /// This function itself isn't "unsafe" but other code will become unsafe if the `slice`
        /// becomes invalid or moves. You'd have to violate safety rules somewhere else to do that,
        /// though.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(slice: &'v [T]) -> StackPinnedSlice<'v, T> {
            Self {
                _lifetime: PhantomData,
                slice: SliceBacking::from_raw(
                    u64::try_from(slice.len()).unwrap(),
                    slice.as_ptr() as *const (),
                ),
            }
        }
    }

    pub struct StackPinnedValue<'v, T> {
        value_ref: &'v T,
    }

    impl<'v, T> StackPinnedValue<'v, T> {
        /// Grants a reference to the pinned value.
        ///
        /// # SAFETY
        /// - The granted reference **must not** outlive the lifetime of `&self`.
        /// - There **must not** be a mutable reference created or mutable dereference performed during the lifetime of the [`FfiValueRef`].
        #[inline(always)]
        pub unsafe fn grant_ref(&self) -> FfiValueRef<T> {
            Ffi {
                inner: self.value_ref as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

    impl<'v, T> StackPinnedValue<'v, T> {
        #[inline(always)]
        pub(crate) fn from_raw(value: &'v T) -> Self {
            Self { value_ref: value }
        }
    }

    pub struct HeapPinnedSlice<'v, T> {
        _lifetime: PhantomData<&'v T>,
        slice: SliceBacking<T>,
    }

    impl<'v, T> HeapPinnedSlice<'v, T> {
        /// Creates a pin guard from a heap placed slice.
        ///
        /// # SAFETY
        /// The `slice` **must not** be moved and **must not** have a mutable reference given during the lifetime
        /// of the returned [`HeapPinnedSlice`] guard.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(slice: &'v FfiSlice<T>) -> HeapPinnedSlice<'v, T> {
            Self {
                _lifetime: PhantomData,
                slice: SliceBacking::from_heap(slice),
            }
        }

        pub unsafe fn grant_ref(&self) -> FfiSliceRef<T> {
            FfiSliceRef {
                inner: &self.slice as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        pub unsafe fn grant_mut(&mut self) -> FfiSliceMut<T> {
            FfiSliceMut {
                inner: &mut self.slice as *mut _ as *mut (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

    #[repr(transparent)]
    pub struct HeapPinnedValue<'v, T> {
        value: &'v FfiValue<T>,
    }

    impl<'v, T> HeapPinnedValue<'v, T> {
        #[inline(always)]
        pub(crate) unsafe fn from_raw(value: &'v FfiValue<T>) -> HeapPinnedValue<'v, T> {
            Self { value }
        }

        #[inline(always)]
        pub unsafe fn grant_ref(&self) -> FfiValueRef<T> {
            FfiValueRef {
                inner: self.value.inner,
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        #[inline(always)]
        pub unsafe fn grant_mut(&mut self) -> FfiValueMut<T> {
            FfiValueMut {
                inner: self.value.inner,
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }
}

mod abi_types {
    pub struct Slice;

    pub struct SliceRef;

    pub struct SliceMut;

    pub struct ValueRef;

    pub struct ValueMut;

    pub struct Value;
}

pub trait StackPinned<'p> {
    type Pinned: ?Sized + 'p;

    fn pin(&'p self) -> Self::Pinned;
}

impl<'p, T: 'p> StackPinned<'p> for [T] {
    type Pinned = StackPinnedSlice<'p, T>;

    #[inline(always)]
    fn pin(&'p self) -> StackPinnedSlice<'p, T> {
        unsafe { StackPinnedSlice::from_raw(self) }
    }
}

impl<'p, T: 'p> StackPinned<'p> for T {
    type Pinned = StackPinnedValue<'p, T>;

    #[inline(always)]
    fn pin(&'p self) -> Self::Pinned {
        StackPinnedValue::from_raw(self)
    }
}