ambee/giterated

Git repository hosting, collaboration, and discovery for the Fediverse.

Pre vtable changes

Amber - 1 year ago

parent: tbd commit: d17a4b2

giterated-abi/src/lib.rs - 17773 bytes
//! Giterated ABI
//! # ABI
//!
//! ## Value ABI
//!
//! At its core, the Giterated Runtime uses the `extern "C"` ABI. What that means concretely is platform specific, and that doesn't
//! matter: you are intended to compile the Giterated Runtime and Plugins for your local machine, all with the same idea of what
//! your "local machine" is.
//!
//! Values are passed using the `FFI` type. There are four categories of value that the `FFI` type enables you to pass:
//!
//! | `FFI` Type Category | Placed Backing? | Owned? |
//! |---------------------|-----------------|--------|
//! | Slice               | Heap/Stack      | No     |
//! | Referenced Slice    | Stack           | No     |
//! | Referenced Value    | No              | No     |
//! | Owned Value         | Heap            | Yes    |
//!
//! For an FFI type to have a "placed backing" means it has some data structure beyond the data it represents, placed
//! somewhere in memory. Some types only require stack placement while others offer both stack and heap placement.
//!
//! Stack-placed values can be shared by `PinnedRef` and `PinnedMut`, and thus can only be owned by the caller.
//!
//! Heap-placed values can be shared by `Owned`, `PinnedRef`, and `PinnedMut`. They can be owned by any one consumer;
//! when the handle with ownership is `Drop`'d by the sole consumer, it frees the object using the associated `Drop` callback.
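//!
//! For illustration, a minimal sketch of the heap-placed flow using the types defined later in this file (the crate path
//! `giterated_abi` is assumed; marked `ignore` because heap placement relies on the `HeapPlacable` impls in `mod heap`):
//!
//! ```ignore
//! use giterated_abi::prelude::*;
//!
//! // Heap-place a value and receive the owning handle.
//! let owned: FfiValue<String> = FfiValue::new(String::from("hello"));
//!
//! // Pin the heap placement, then grant a read-only reference across FFI.
//! let pinned = owned.pin();
//! let shared: FfiValueRef<String> = unsafe { pinned.grant_ref() };
//! assert_eq!(*shared, "hello");
//! drop(shared);
//! drop(pinned);
//!
//! // Move the value back out; the backing allocation is freed by its drop callback.
//! let value: String = owned.take();
//! assert_eq!(value, "hello");
//! ```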
//!
//! ### Safety Intents
//!
//! This API is designed to simplify interaction with FFI values and to provide a static ABI for those values to be passed. It
//! is key to enabling ownership across FFI while ensuring the associated dropping and allocation-freeing logic is run.
//!
//! This system simplifies the contract the developer has to follow, and it allows generic code to be written that
//! interacts with FFI-given values and passes values using FFI.
//!
//! ### Stability Guarantees
//!
//! There are no plans to guarantee stability until 1.0.0. At that point you can expect the ABI to remain stable until the major version
//! is incremented again. There will be an appropriate deprecation process and changeover period.
//!
//! ### Memory Representation
//!
//! Please check out the source code, sorry if you needed that from the docs!
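//!
//! That said, as a rough orientation, here is a simplified view of the layouts defined later in this file (visibility
//! and `PhantomData` markers omitted; the definitions below are authoritative):
//!
//! ```ignore
//! // A value passed over FFI: a single pointer, interpreted according to the `ABI` marker type.
//! #[repr(transparent)]
//! pub struct Ffi<T: ?Sized, ABI> {
//!     inner: *const (),
//!     // PhantomData markers for `T` and `ABI` omitted here.
//! }
//!
//! // Heap placement for an owned value: the value itself plus the callback that frees it.
//! #[repr(C)]
//! struct HeapValueBacking<T> {
//!     value: T,
//!     drop_fn: unsafe extern "C" fn(value: FfiValue<T>, taken: bool),
//! }
//!
//! // Backing for slices: an element count plus a pointer to the first element.
//! struct SliceBacking<T: ?Sized> {
//!     count: u64,
//!     slice: *const (),
//! }
//! ```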
//!
//! ## Object, Operation, Setting, Value, Plugin, and Runtime ABIs
//!
//! The Giterated Runtime uses vtables to ensure maximum compatibility. For every object that is shared
//! between plugins, a vtable is used to allow each plugin to provide its own code for interacting with the object.
//!
//! When objects switch "runtime domains" (e.g. host -> plugin, plugin -> plugin, plugin -> host), their vtable is swapped out
//! for the new runtime domain's own vtable.
//!
//! ### Untyped "Objects" (see above header for list)
//!
//! Untyped objects are represented in memory by a data pointer and a vtable pointer, exactly like Rust trait objects. However, to
//! prevent small compilation differences and other incidental variation from making the interface incompatible, the local
//! plugin's idea of the vtable for the object is used at all times. An object that the plugin does not have a vtable for cannot
//! be relevant to the plugin.
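//!
//! For illustration, a sketch of that representation (hypothetical names; the real object and vtable definitions live
//! elsewhere in this crate, e.g. in `mod vtable`):
//!
//! ```ignore
//! /// Hypothetical untyped object handle: data pointer plus vtable pointer.
//! #[repr(C)]
//! struct ObjectHandle {
//!     /// Opaque pointer to the object's data.
//!     data: *mut (),
//!     /// Pointer to the *local* plugin's vtable for this object type; swapped out
//!     /// whenever the object crosses a runtime domain boundary.
//!     vtable: *const ObjectVTable,
//! }
//! ```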
//!
//! It is important that the object's base representation in memory remains unchanged within a major version, but the vtables that provide methods for
//! that object may be grown. The methods that operate on that object may be changed in a non-breaking fashion, and bugs can be
//! fixed.
//!
//! ## Futures ABI
//!
//! The Giterated Runtime has an async runtime that allows futures to be shared and awaited across FFI boundaries while only
//! executing the future within the context of the Plugin that is running the underlying future.
//!
//! Futures are spawned onto the `RuntimeState` with the `RuntimeFuturesExt` trait. This takes a Rust future, boxes it, and
//! provides a `RuntimeFuture` handle that can be used to drive the underlying Rust future locally. The `RuntimeFuture` handle
//! is thread-safe and can be shared with the callee and `.await`'d directly like any other future.
//!
//! ### RuntimeFuture
//!
//! The `RuntimeFuture` mixes a vtable with data to allow any caller to drive a spawned future. It contains:
//!
//! - A `poll_fn` which is used to poll the future for `Ready`-ness.
//! - A `wake_fn` which is used to wake the callee to poll for (expected) `Ready`-ness; it is populated when the `RuntimeFuture` is `await`'d.
//!
//! When the `RuntimeFuture` is polled, it causes the inner future to also be polled. We provide the inner future with a waker
//! that triggers the `RuntimeFuture`'s waker so it is polled again. Breaking character to point out how freaking cool that is.
//!
//! `RuntimeFuture`s drop the associated inner future as they drop.
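//!
//! For illustration, a sketch of the shape described above (hypothetical field names and signatures; the real
//! definitions live in `mod future` and are re-exported here as `FfiFuture` and `RuntimeFuturePoll`):
//!
//! ```ignore
//! use std::future::Future;
//! use std::pin::Pin;
//! use std::task::{Context, Poll, Waker};
//!
//! /// Hypothetical cross-FFI future handle: a pointer to the spawned future's
//! /// state plus the entries used to drive it.
//! struct SketchRuntimeFuture<T> {
//!     state: *mut (),
//!     /// Polls the spawned future; returns `Some(value)` once it is `Ready`.
//!     poll_fn: unsafe fn(state: *mut ()) -> Option<T>,
//!     /// Populated with the caller's waker when the handle is `.await`'d, so the
//!     /// side running the inner future can wake the caller back up.
//!     wake_fn: Option<Waker>,
//! }
//!
//! impl<T> Future for SketchRuntimeFuture<T> {
//!     type Output = T;
//!
//!     fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<T> {
//!         // Remember the caller's waker so the other side can trigger it when the
//!         // inner future signals readiness.
//!         self.wake_fn = Some(cx.waker().clone());
//!
//!         // Drive the inner future through the handle's poll entry.
//!         match unsafe { (self.poll_fn)(self.state) } {
//!             Some(value) => Poll::Ready(value),
//!             None => Poll::Pending,
//!         }
//!     }
//! }
//! ```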

mod future;
mod heap;
pub mod result;
pub mod vtable;
use abi_backing::{HeapValueBacking, SliceBacking};
pub use future::{FfiFuture, RuntimeFuturePoll};
use heap::HeapPlacable;

use std::{
    marker::PhantomData,
    mem::{transmute, MaybeUninit},
    ops::{Deref, DerefMut},
};

use abi_types::{Slice, SliceMut, SliceRef, Value, ValueMut, ValueRef};
use guards::{HeapPinnedSlice, HeapPinnedValue, StackPinnedSlice, StackPinnedValue};

#[doc(hidden)]
pub mod prelude {
    pub use crate::Ffi;
    pub use crate::StackPinned;
    pub use crate::{FfiSlice, FfiSliceRef, FfiValue, FfiValueRef};
}

/// Slice Reference
/// Heap or Stack Placed
pub type FfiSliceRef<T> = Ffi<T, SliceRef>;

/// Mutable Slice Reference
/// Heap or Stack Placed
pub type FfiSliceMut<T> = Ffi<T, SliceMut>;

/// Value Reference
/// Heap or Stack Placed
pub type FfiValueRef<T> = Ffi<T, ValueRef>;

/// Mutable Value Reference
/// Heap or Stack Placed
pub type FfiValueMut<T> = Ffi<T, ValueMut>;

/// Owned Value
/// Heap Placed
pub type FfiValue<T> = Ffi<T, Value>;

/// Owned Slice
/// Heap Placed
pub type FfiSlice<T> = Ffi<T, Slice>;

pub mod value_ex {
    use crate::{
        abi_types::{Value, ValueRef},
        Ffi,
    };

    pub type FfiValueUntyped = Ffi<(), Value>;
    pub type FfiValueRefUntyped = Ffi<(), ValueRef>;
}

/// A value passed over FFI, following the Giterated ABI.
///
/// The function of the [`Ffi`] type is to take an arbitrary pointer and send it over FFI.
/// Both the caller and callee **must** have the same understanding of what the pointer represents.
/// The [`Ffi`] type is also used to encode ownership information.
///
/// # The Pointer
/// The pointer contained within the [`Ffi`] is transmuted based on the provided `ABI` on the
/// [`Ffi`] type signature.
#[repr(transparent)]
pub struct Ffi<T: ?Sized, ABI> {
    inner: *const (),
    _type_marker: PhantomData<T>,
    _abi_marker: PhantomData<ABI>,
}

impl<T> FfiSlice<T> {
    #[inline(always)]
    pub fn pin(&self) -> HeapPinnedSlice<'_, T> {
        unsafe { HeapPinnedSlice::from_raw(self) }
    }
}

impl<T> Deref for FfiSlice<T> {
    type Target = [T];

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const SliceBacking<[T]> = unsafe { transmute(self.inner) };
        let backing = unsafe { inner.as_ref().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts(
                backing.slice as *const T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}

impl<T> DerefMut for FfiSlice<T> {
    #[inline(always)]
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut SliceBacking<[T]> = unsafe { transmute(self.inner) };
        let backing = unsafe { inner.as_mut().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts_mut(
                backing.slice as *mut T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}

impl<T> FfiSliceRef<T> {}

impl<T> Deref for FfiSliceRef<[T]> {
    type Target = [T];

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const SliceBacking<[T]> = unsafe { transmute(self.inner) };

        let backing = unsafe { inner.as_ref().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts(
                backing.slice as *const T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}

impl<T> FfiValueRef<T> {}

impl<T> Deref for FfiValueRef<T> {
    type Target = T;

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const T = unsafe { transmute(self.inner) };

        match unsafe { inner.as_ref() } {
            Some(val) => val,
            _ => unreachable!(),
        }
    }
}

impl<T> Deref for FfiValueMut<T> {
    type Target = T;

    fn deref(&self) -> &Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_ref().unwrap_unchecked() }
    }
}

impl<T> DerefMut for FfiValueMut<T> {
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_mut().unwrap_unchecked() }
    }
}

impl<T> std::fmt::Display for FfiValueRef<T>
where
    T: std::fmt::Display,
{
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        unsafe { (self.inner as *const T).as_ref().unwrap() }.fmt(f)
    }
}

impl<T> FfiValue<T> {
    pub fn new(value: T) -> Self {
        let value = Box::new(HeapValueBacking {
            value,
            drop_fn: <T as HeapPlacable>::free,
        });

        FfiValue {
            inner: Box::into_raw(value) as _,
            _type_marker: PhantomData,
            _abi_marker: PhantomData,
        }
    }

    pub fn pin(&self) -> HeapPinnedValue<'_, T> {
        unsafe { HeapPinnedValue::from_raw(self) }
    }

    pub fn take(self) -> T {
        // This all boils down to moving `T` out of the `FfiValue` and dropping the backing
        // storage for said `FfiValue`. Despite the use of unsafe this is exactly how moving
        // a value onto the stack works.

        let inner = self.inner as *mut T;
        let mut move_target: MaybeUninit<T> = MaybeUninit::zeroed();

        unsafe { move_target.as_mut_ptr().copy_from(inner, 1) }

        let inner_descriptor: *mut HeapValueBacking<T> = unsafe { transmute(self.inner) };

        unsafe { (inner_descriptor.as_mut().unwrap_unchecked().drop_fn)(self, true) };

        unsafe { move_target.assume_init() }
    }
}

impl<T> Deref for FfiValue<T> {
    type Target = T;

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const T = unsafe { transmute(self.inner) };

        unsafe { inner.as_ref().unwrap_unchecked() }
    }
}

impl<T> DerefMut for FfiValue<T> {
    #[inline(always)]
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_mut().unwrap_unchecked() }
    }
}

mod abi_backing {
    use std::{marker::PhantomData, mem::transmute};

    use crate::{FfiSlice, FfiValue};

    #[repr(C)]
    pub struct HeapValueBacking<T: Sized> {
        pub(super) value: T,
        pub(super) drop_fn: unsafe extern "C" fn(value: FfiValue<T>, taken: bool),
    }

    /// Backing for slices shared over FFI: an element count plus a pointer to the first
    /// element. `repr(C)` so the layout is identical on both sides of the boundary.
    #[repr(C)]
    pub struct SliceBacking<T: ?Sized> {
        pub(crate) count: u64,
        pub(crate) slice: *const (),
        _marker: PhantomData<T>,
    }

    impl<T: ?Sized> SliceBacking<T> {
        /// Creates a new slice backing from a raw slice pointer and a count.
        ///
        /// # SAFETY
        ///
        /// `slice` **must** refer to a valid slice, with a length greater than or equal to the
        /// value provided as `count`.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(count: u64, slice: *const ()) -> Self {
            Self {
                count,
                slice,
                _marker: PhantomData,
            }
        }

        /// Creates a new slice backing from an [`FfiSlice`].
        ///
        /// # SAFETY
        ///
        /// The resultant [`SliceBacking`] **must not** outlive the backing [`FfiSlice`].
        #[inline(always)]
        pub(crate) unsafe fn from_heap(slice: &FfiSlice<T>) -> Self {
            let heap_backing: *const SliceBacking<T> = unsafe { transmute(slice.inner) };

            let heap_backing = unsafe { heap_backing.as_ref().unwrap_unchecked() };

            Self {
                count: heap_backing.count,
                slice: heap_backing.slice,
                _marker: PhantomData,
            }
        }
    }
}

mod guards {
    use std::marker::PhantomData;

    use crate::{
        abi_backing::SliceBacking, Ffi, FfiSlice, FfiSliceMut, FfiSliceRef, FfiValue, FfiValueMut,
        FfiValueRef,
    };

    #[repr(transparent)]
    pub struct StackPinnedSlice<'v, T: ?Sized> {
        _lifetime: PhantomData<&'v T>,
        slice: SliceBacking<T>,
    }

    impl<'v, T> StackPinnedSlice<'v, T> {
        #[inline(always)]
        pub fn as_ref(&self) -> FfiSliceRef<T> {
            FfiSliceRef {
                inner: &self.slice as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        #[inline(always)]
        pub fn as_mut(&mut self) -> FfiSliceMut<T> {
            FfiSliceMut {
                inner: &mut self.slice as *mut _ as *mut (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

    impl<'v, T> StackPinnedSlice<'v, T> {
        /// Creates a stack pinned slice guard from a borrowed slice.
        ///
        /// # SAFETY
        /// This function itself isn't "unsafe" but other code will become unsafe if the `slice`
        /// becomes invalid or moves. You'd have to violate safety rules somewhere else to do that,
        /// though.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(slice: &'v [T]) -> StackPinnedSlice<'v, T> {
            Self {
                _lifetime: PhantomData,
                slice: SliceBacking::from_raw(
                    u64::try_from(slice.len()).unwrap(),
                    slice.as_ptr() as *const (),
                ),
            }
        }
    }

    pub struct StackPinnedValue<'v, T> {
        value_ref: &'v T,
    }

    impl<'v, T> StackPinnedValue<'v, T> {
        /// Grants a reference to the pinned value.
        ///
        /// # SAFETY
        /// - The granted reference **must not** outlive the lifetime of `&self`.
        /// - There **must not** be a mutable reference created or mutable dereference performed during the lifetime of the [`FfiValueRef`].
        #[inline(always)]
        pub unsafe fn grant_ref(&self) -> FfiValueRef<T> {
            Ffi {
                inner: self.value_ref as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

    impl<'v, T> StackPinnedValue<'v, T> {
        #[inline(always)]
        pub(crate) fn from_raw(value: &'v T) -> Self {
            Self { value_ref: value }
        }
    }

    pub struct HeapPinnedSlice<'v, T> {
        _lifetime: PhantomData<&'v T>,
        slice: SliceBacking<T>,
    }

    impl<'v, T> HeapPinnedSlice<'v, T> {
        /// Creates a pin guard from a heap placed slice.
        ///
        /// # SAFETY
        /// The `slice` **must not** be moved and **must not** have a mutable reference given during the lifetime
        /// of the returned [`HeapPinnedSlice`] guard.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(slice: &'v FfiSlice<T>) -> HeapPinnedSlice<'v, T> {
            Self {
                _lifetime: PhantomData,
                slice: SliceBacking::from_heap(slice),
            }
        }

        pub unsafe fn grant_ref(&self) -> FfiSliceRef<T> {
            FfiSliceRef {
                inner: &self.slice as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        pub unsafe fn grant_mut(&mut self) -> FfiSliceMut<T> {
            FfiSliceMut {
                inner: &mut self.slice as *mut _ as *mut (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

    #[repr(transparent)]
    pub struct HeapPinnedValue<'v, T> {
        value: &'v FfiValue<T>,
    }

    impl<'v, T> HeapPinnedValue<'v, T> {
        #[inline(always)]
        pub(crate) unsafe fn from_raw(value: &'v FfiValue<T>) -> HeapPinnedValue<'v, T> {
            Self { value }
        }

        #[inline(always)]
        pub unsafe fn grant_ref(&self) -> FfiValueRef<T> {
            FfiValueRef {
                inner: self.value.inner,
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        #[inline(always)]
        pub unsafe fn grant_mut(&mut self) -> FfiValueMut<T> {
            FfiValueMut {
                inner: self.value.inner,
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }
}

mod abi_types {
    pub struct Slice;

    pub struct SliceRef;

    pub struct SliceMut;

    pub struct ValueRef;

    pub struct ValueMut;

    pub struct Value;
}

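/// Pins a value or slice to the stack so FFI handles can be granted from it.
///
/// For illustration, a sketch of how a caller might use this (crate path `giterated_abi` assumed;
/// `grant_ref` is `unsafe` because the granted reference must not outlive the guard):
///
/// ```ignore
/// use giterated_abi::prelude::*;
///
/// let config_value = 42u32;
///
/// // Pin the stack value, then grant an FFI reference to it.
/// let pinned = config_value.pin();
/// let reference: FfiValueRef<u32> = unsafe { pinned.grant_ref() };
///
/// // The reference derefs back to the pinned value.
/// assert_eq!(*reference, 42);
/// ```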
pub trait StackPinned<'p> {
    type Pinned: 'p;

    fn pin(&'p self) -> Self::Pinned;
}

impl<'p, T: 'p> StackPinned<'p> for [T] {
    type Pinned = StackPinnedSlice<'p, T>;

    #[inline(always)]
    fn pin(&'p self) -> StackPinnedSlice<'p, T> {
        unsafe { StackPinnedSlice::from_raw(self) }
    }
}

impl<'p, T: 'p> StackPinned<'p> for T {
    type Pinned = StackPinnedValue<'p, T>;

    #[inline(always)]
    fn pin(&'p self) -> Self::Pinned {
        StackPinnedValue::from_raw(self)
    }
}