ambee/giterated

Git repository hosting, collaboration, and discovery for the Fediverse.

giterated-abi/src/lib.rs
//! Giterated ABI
//! # ABI
//!
//! ## Value ABI
//!
//! At its core, the Giterated Runtime uses the `extern "C"` ABI. What that means in practice is platform specific, and that
//! is fine: you are intended to compile the Giterated Runtime and Plugins for your local machine, all with the same idea of
//! what your "local machine" is.
//!
//! Values are passed using the `FFI` type. There are four categories of value that the `FFI` type enables you to pass:
//!
//! | `FFI` Type Category | Placed Backing? | Owned? |
//! |---------------------|-----------------|--------|
//! | Slice               | Heap/Stack      | No     |
//! | Referenced Slice    | Stack           | No     |
//! | Referenced Value    | No              | No     |
//! | Owned Value         | Heap            | Yes    |
//!
//! For an FFI type to have a "placed backing" is for it to have some data structure beyond the data it represents, placed
//! somewhere in memory. Some types only require stack placement while some offer both stack and heap placement.
//!
//! Stack-placed values can be shared by `PinnedRef` and `PinnedMut`, and thus can only be owned by the caller.
//!
//! Heap-placed values can be shared by `Owned`, `PinnedRef`, and `PinnedMut`. They can be owned by any one consumer.
//! When the owning handle is `Drop`'d by that sole consumer, it frees the object using the associated `Drop` callback.
//!
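//! As a minimal sketch of that ownership model (illustrative only; it assumes the crate is consumed as `giterated_abi`
//! and that the value type satisfies the heap-placement requirements of [`FfiValue::new`]):
//!
//! ```ignore
//! use giterated_abi::prelude::*;
//!
//! // Heap-place a value so it can be handed across the FFI boundary.
//! let value: FfiValue<u32> = FfiValue::new(42);
//!
//! {
//!     // Pin the heap-placed value and grant a borrowed, non-owning reference.
//!     let pinned = value.pin();
//!     let shared: FfiValueRef<u32> = unsafe { pinned.grant_ref() };
//!     assert_eq!(*shared, 42);
//! }
//!
//! // Take ownership back out of the heap backing; the backing storage is
//! // freed through the associated drop callback.
//! assert_eq!(value.take(), 42);
//! ```
//!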
//! ### Safety Intents
//!
//! This API is designed to simplify interaction with FFI values and provide a static ABI for those values to be passed. It
//! is key to enabling ownership across FFI while ensuring the associated drop and allocation-freeing logic is run.
//!
//! This system simplifies the contract the developer has to follow, and it allows generic code to be written both for
//! interacting with values received over FFI and for passing values over FFI.
//!
//! ### Stability Guarantees
//!
//! There are no plans to guarantee stability until 1.0.0. At that point you can expect the ABI to remain stable until the major version
//! is incremented again. There will be an appropriate deprecation process and changeover period.
//!
//! ### Memory Representation
//!
//! Please check out the source code, sorry if you needed that from the docs!
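//!
//! That said, here is a minimal sketch of the two backing layouts declared later in this file (field order as declared;
//! treat this as illustrative rather than normative):
//!
//! ```ignore
//! // Heap-placed values: the value itself, followed by the callback used to
//! // free the heap backing when the owning handle is dropped or taken.
//! #[repr(C)]
//! struct HeapValueBacking<T: Sized> {
//!     value: T,
//!     drop_fn: unsafe extern "C" fn(value: FfiValue<T>, taken: bool),
//! }
//!
//! // Slices: an element count plus a pointer to the first element.
//! struct SliceBacking<T: ?Sized> {
//!     count: u64,
//!     slice: *const (),
//!     _marker: PhantomData<T>,
//! }
//! ```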
//!
//! ## Object, Operation, Setting, Value, Plugin, and Runtime ABIs
//!
//! The Giterated Runtime uses vtables to ensure maximum compatibility. For every object that is shared
//! between plugins, a vtable is used to allow each plugin to provide its own code for interacting with the object.
//!
//! When objects switch "runtime domains" (e.g. host -> plugin, plugin -> plugin, plugin -> host), their vtable is swapped out
//! for the new runtime domain's own vtables.
//!
//! ### Untyped "Objects" (see above header for list)
//!
//! Untyped objects are represented in memory by a data pointer and a vtable pointer, exactly like Rust trait objects. However, to
//! prevent small compilation differences and other random garbage from making the interface not perfectly compatible, we use
//! the local plugin's idea of the vtable for the object at all times. An object that the plugin does not have a vtable for cannot
//! be relevant to the plugin.
//!
//! It is important that the object's base representation in memory remain unchanged between major versions, but the vtables that provide methods for
//! that object may be grown. The methods that operate on that object may be changed in a non-breaking fashion, and bugs can be
//! fixed.
//!
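//! As an illustrative (non-normative) sketch of that layout, using hypothetical names for the handle and vtable types:
//!
//! ```ignore
//! #[repr(C)]
//! struct ObjectHandle {
//!     /// Pointer to the object's data; opaque to the receiving runtime domain.
//!     data: *const (),
//!     /// Pointer to the *local* plugin's vtable for this object type. This is
//!     /// the pointer that gets swapped when the object crosses a runtime
//!     /// domain boundary.
//!     vtable: *const ObjectVtable,
//! }
//! ```
//!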
//! ## Futures ABI
//!
//! The Giterated Runtime has an async runtime that allows futures to be shared and awaited across FFI boundaries while only
//! executing the future within the context of the Plugin that is running the underlying future.
//!
//! Futures are spawned onto the `RuntimeState` with the `RuntimeFuturesExt` trait. This takes a Rust future, boxes it, and
//! provides a `RuntimeFuture` handle that can be used to drive the underlying Rust future locally. The `RuntimeFuture` handle
//! is thread-safe and can be shared with the callee and `.await`'d directly like any other future.
//!
//! ### RuntimeFuture
//!
//! The `RuntimeFuture` mixes a vtable with data to allow any caller to drive a spawned future. It contains:
//!
//! - A `poll_fn`, which is used to poll the future for `Ready`-ness.
//! - A `wake_fn`, which is used to wake the callee to poll for (expected) `Ready`-ness; it is populated when the `RuntimeFuture` is `await`'d.
//!
//! When the `RuntimeFuture` is polled, it causes the inner future to also be polled. We provide the inner future with a waker
//! that triggers the `RuntimeFuture`'s waker so it is polled again. Breaking character to point out how freaking cool that is.
//!
//! `RuntimeFuture`s drop the associated inner future as they drop.
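//!
//! A minimal sketch of the intended call pattern (illustrative only: `RuntimeState`, `RuntimeFuturesExt`, and
//! `RuntimeFuture` live outside this file, and the `spawn_future` method name below is hypothetical):
//!
//! ```ignore
//! // Spawn a Rust future onto the runtime state. This boxes the future and
//! // hands back a `RuntimeFuture` handle; the future itself only ever runs
//! // inside the plugin that spawned it.
//! let handle = runtime_state.spawn_future(async { 1 + 1 });
//!
//! // The handle is thread-safe, so it can be handed to the callee and
//! // `.await`'d directly like any other future.
//! assert_eq!(handle.await, 2);
//! ```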

pub mod callback;
mod future;
pub mod heap;
pub mod model_impl;
pub mod result;
pub mod vtable;
use abi_backing::{HeapValueBacking, SliceBacking};
pub use future::{FfiFuture, RuntimeFuturePoll};
use heap::HeapPlacable;

use std::{
    marker::PhantomData,
    mem::{transmute, MaybeUninit},
    ops::{Deref, DerefMut},
};

use abi_types::{Slice, SliceMut, SliceRef, Value, ValueMut, ValueRef};
use guards::{HeapPinnedSlice, HeapPinnedValue, StackPinnedSlice, StackPinnedValue};

#[doc(hidden)]
pub mod prelude {
    pub use crate::Ffi;
    pub use crate::StackPinned;
    pub use crate::*;
    pub use crate::{FfiSlice, FfiSliceRef, FfiValue, FfiValueRef};
}

/// Slice Reference
/// Heap or Stack Placed
pub type FfiSliceRef<T> = Ffi<T, SliceRef>;

/// Mutable Slice Reference
/// Heap or Stack Placed
pub type FfiSliceMut<T> = Ffi<T, SliceMut>;

/// Value Reference
/// Heap or Stack Placed
pub type FfiValueRef<T> = Ffi<T, ValueRef>;

/// Mutable Value Reference
/// Heap or Stack Placed
pub type FfiValueMut<T> = Ffi<T, ValueMut>;

/// Owned Value
/// Heap Placed
pub type FfiValue<T> = Ffi<T, Value>;

/// Owned Slice
/// Heap Placed
pub type FfiSlice<T> = Ffi<T, Slice>;

pub mod value_ex {
    use crate::{abi_types::Value, Ffi};

    pub type FfiValueUntyped = Ffi<(), Value>;
    pub type FfiValueRefUntyped = Ffi<(), Value>;
}

/// A value passed over FFI, following the Giterated ABI.
///
/// The function of the [`Ffi`] type is to take an arbitrary pointer and send it over FFI.
/// Both the caller and callee **must** have the same understanding of what the pointer represents.
/// The [`Ffi`] type is also used to encode ownership information.
///
/// # The Pointer
/// The pointer contained within the [`Ffi`] is transmuted based on the provided `ABI` on the
/// [`Ffi`] type signature.
#[repr(transparent)]
pub struct Ffi<T: ?Sized, ABI> {
    inner: *const (),
    _type_marker: PhantomData<T>,
    _abi_marker: PhantomData<ABI>,
}

impl<T> FfiSlice<T> {
    #[inline(always)]
    pub fn pin(&self) -> HeapPinnedSlice<'_, T> {
        unsafe { HeapPinnedSlice::from_raw(self) }
    }
}

impl<T> Deref for FfiSlice<T> {
    type Target = [T];

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const SliceBacking<[T]> = unsafe { transmute(self.inner) };
        let backing = unsafe { inner.as_ref().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts(
                backing.slice as *mut T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}

impl<T> DerefMut for FfiSlice<T> {
    #[inline(always)]
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut SliceBacking<[T]> = unsafe { transmute(self.inner) };
        let backing = unsafe { inner.as_mut().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts_mut(
                backing.slice as *mut T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}

impl<T> FfiSliceRef<T> {}

impl<T> Deref for FfiSliceRef<[T]> {
    type Target = [T];

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const SliceBacking<[T]> = unsafe { transmute(self.inner) };

        let backing = unsafe { inner.as_ref().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts(
                backing.slice as *const T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}

impl<T> FfiValueRef<T> {}

impl<T> Deref for FfiValueRef<T> {
    type Target = T;

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const T = unsafe { transmute(self.inner) };

        match unsafe { inner.as_ref() } {
            Some(val) => val,
            _ => unreachable!(),
        }
    }
}

impl<T> Deref for FfiValueMut<T> {
    type Target = T;

    fn deref(&self) -> &Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_ref().unwrap_unchecked() }
    }
}

impl<T> DerefMut for FfiValueMut<T> {
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_mut().unwrap_unchecked() }
    }
}

impl<T> std::fmt::Display for FfiValueRef<T>
where
    T: std::fmt::Display,
{
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        unsafe { (self.inner as *const T).as_ref().unwrap() }.fmt(f)
    }
}

impl<T> FfiValue<T> {
    pub fn new(value: T) -> Self {
        let value = Box::new(HeapValueBacking {
            value,
            drop_fn: <T as HeapPlacable>::free,
        });

        FfiValue {
            inner: Box::into_raw(value) as _,
            _type_marker: PhantomData,
            _abi_marker: PhantomData,
        }
    }

    pub fn pin(&self) -> HeapPinnedValue<'_, T> {
        unsafe { HeapPinnedValue::from_raw(self) }
    }

    pub fn take(self) -> T {
        // This all boils down to moving `T` out of the `FfiValue` and dropping the backing
        // storage for said `FfiValue`. Despite the use of unsafe this is exactly how moving
        // a value onto the stack works.

        let inner = self.inner as *mut T;
        let mut move_target: MaybeUninit<T> = MaybeUninit::zeroed();

        unsafe { move_target.as_mut_ptr().copy_from(inner, 1) }

        let inner_descriptor: *mut HeapValueBacking<T> = unsafe { transmute(self.inner) };

        unsafe { (inner_descriptor.as_mut().unwrap_unchecked().drop_fn)(self, true) };

        unsafe { move_target.assume_init() }
    }
}

impl<T> Deref for FfiValue<T> {
    type Target = T;

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const T = unsafe { transmute(self.inner) };

        unsafe { inner.as_ref().unwrap_unchecked() }
    }
}

impl<T> DerefMut for FfiValue<T> {
    #[inline(always)]
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_mut().unwrap_unchecked() }
    }
}

mod abi_backing {
    use std::{marker::PhantomData, mem::transmute};

    use crate::{FfiSlice, FfiValue};

    #[repr(C)]
    pub struct HeapValueBacking<T: Sized> {
        pub(super) value: T,
        pub(super) drop_fn: unsafe extern "C" fn(value: FfiValue<T>, taken: bool),
    }

    pub struct SliceBacking<T: ?Sized> {
        pub(crate) count: u64,
        pub(crate) slice: *const (),
        _marker: PhantomData<T>,
    }

    impl<T: ?Sized> SliceBacking<T> {
        /// Creates a new slice backing from a raw slice pointer and a count.
        ///
        /// # SAFETY
        ///
        /// `slice` **must** refer to a valid slice, with a length greater than or equal to the
        /// value provided as `count`.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(count: u64, slice: *const ()) -> Self {
            Self {
                count,
                slice,
                _marker: PhantomData,
            }
        }

        /// Creates a new slice backing from an [`FfiSlice`].
        ///
        /// # SAFETY
        ///
        /// The resultant [`SliceBacking`] **must not** outlive the backing [`FfiSlice`].
        #[inline(always)]
        pub(crate) unsafe fn from_heap(slice: &FfiSlice<T>) -> Self {
            let heap_backing: *const SliceBacking<T> = unsafe { transmute(slice.inner) };

            let heap_backing = unsafe { heap_backing.as_ref().unwrap_unchecked() };

            Self {
                count: heap_backing.count,
                slice: heap_backing.slice,
                _marker: PhantomData,
            }
        }
    }
}

mod guards {
    use std::marker::PhantomData;

    use crate::{
        abi_backing::SliceBacking, Ffi, FfiSlice, FfiSliceMut, FfiSliceRef, FfiValue, FfiValueMut,
        FfiValueRef,
    };

    #[repr(transparent)]
    pub struct StackPinnedSlice<'v, T: ?Sized> {
        _lifetime: PhantomData<&'v T>,
        slice: SliceBacking<T>,
    }

    impl<'v, T> StackPinnedSlice<'v, T> {
        #[inline(always)]
        pub fn as_ref(&self) -> FfiSliceRef<T> {
            FfiSliceRef {
                inner: &self.slice as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        #[inline(always)]
        pub fn as_mut(&mut self) -> FfiSliceMut<T> {
            FfiSliceMut {
                inner: &mut self.slice as *mut _ as *mut (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

    impl<'v, T> StackPinnedSlice<'v, T> {
        /// Creates a stack pinned slice guard from a borrowed slice.
        ///
        /// # SAFETY
        /// This function itself isn't "unsafe" but other code will become unsafe if the `slice`
        /// becomes invalid or moves. You'd have to violate safety rules somewhere else to do that,
        /// though.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(slice: &'v [T]) -> StackPinnedSlice<'v, T> {
            Self {
                _lifetime: PhantomData,
                slice: SliceBacking::from_raw(
                    u64::try_from(slice.len()).unwrap(),
                    slice.as_ptr() as *const (),
                ),
            }
        }
    }

    pub struct StackPinnedValue<'v, T> {
        value_ref: &'v T,
    }

    impl<'v, T> StackPinnedValue<'v, T> {
        /// Grants a reference to the pinned value.
        ///
        /// # SAFETY
        /// - The granted reference **must not** outlive the lifetime of `&self`.
        /// - There **must not** be a mutable reference created or mutable dereference performed during the lifetime of the [`FfiValueRef`].
        #[inline(always)]
        pub unsafe fn grant_ref(&self) -> FfiValueRef<T> {
            Ffi {
                inner: self.value_ref as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

    impl<'v, T> StackPinnedValue<'v, T> {
        #[inline(always)]
        pub(crate) fn from_raw(value: &'v T) -> Self {
            Self { value_ref: value }
        }
    }

    pub struct HeapPinnedSlice<'v, T> {
        _lifetime: PhantomData<&'v T>,
        slice: SliceBacking<T>,
    }

    impl<'v, T> HeapPinnedSlice<'v, T> {
        /// Creates a pin guard from a heap placed slice.
        ///
        /// # SAFETY
        /// The `slice` **must not** be moved and **must not** have a mutable reference given during the lifetime
        /// of the returned [`HeapPinnedSlice`] guard.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(slice: &'v FfiSlice<T>) -> HeapPinnedSlice<'v, T> {
            Self {
                _lifetime: PhantomData,
                slice: SliceBacking::from_heap(slice),
            }
        }

        pub unsafe fn grant_ref(&self) -> FfiSliceRef<T> {
            FfiSliceRef {
                inner: &self.slice as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        pub unsafe fn grant_mut(&mut self) -> FfiSliceMut<T> {
            FfiSliceMut {
                inner: &mut self.slice as *mut _ as *mut (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

    #[repr(transparent)]
    pub struct HeapPinnedValue<'v, T> {
        value: &'v FfiValue<T>,
    }

    impl<'v, T> HeapPinnedValue<'v, T> {
        #[inline(always)]
        pub(crate) unsafe fn from_raw(value: &'v FfiValue<T>) -> HeapPinnedValue<'v, T> {
            Self { value }
        }

        #[inline(always)]
        pub unsafe fn grant_ref(&self) -> FfiValueRef<T> {
            FfiValueRef {
                inner: self.value.inner,
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        #[inline(always)]
        pub unsafe fn grant_mut(&mut self) -> FfiValueMut<T> {
            FfiValueMut {
                inner: self.value.inner,
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }
}

mod abi_types {
    pub struct Slice;

    pub struct SliceRef;

    pub struct SliceMut;

    pub struct ValueRef;

    pub struct ValueMut;

    pub struct Value;
}

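/// A value that can be pinned to the stack so borrowed FFI handles can be granted from it.
///
/// A minimal usage sketch (illustrative only; it assumes the crate is consumed as `giterated_abi` and uses fully
/// qualified syntax to select the slice implementation):
///
/// ```ignore
/// use giterated_abi::prelude::*;
///
/// let values = vec![1u32, 2, 3];
///
/// // Pin the borrowed slice on the stack; the guard must not outlive `values`.
/// let pinned = <[u32] as StackPinned>::pin(&values[..]);
///
/// // Grant an FFI slice reference backed by the stack-pinned guard.
/// let slice_ref = pinned.as_ref();
/// ```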
pub trait StackPinned<'p> {
    type Pinned: ?Sized + 'p;

    fn pin(&'p self) -> Self::Pinned;
}

impl<'p, T: 'p> StackPinned<'p> for [T] {
    type Pinned = StackPinnedSlice<'p, T>;

    #[inline(always)]
    fn pin(&'p self) -> StackPinnedSlice<'p, T> {
        unsafe { StackPinnedSlice::from_raw(self) }
    }
}

impl<'p, T: 'p> StackPinned<'p> for T {
    type Pinned = StackPinnedValue<'p, T>;

    #[inline(always)]
    fn pin(&'p self) -> Self::Pinned {
        StackPinnedValue::from_raw(self)
    }
}
550