
ambee/giterated

Git repository hosting, collaboration, and discovery for the Fediverse.

insanity

Amber - 1 year ago

parent: tbd commit: 6ea28ab

giterated-abi/src/lib.rs - 17983 bytes
//! Giterated ABI
//!
//! # ABI
//!
//! ## Value ABI
//!
//! At its core, the Giterated Runtime uses the `extern "C"` ABI. Exactly what that means is platform specific, and it
//! doesn't need to matter: you are expected to compile the Giterated Runtime and Plugins for your local machine, so
//! they all share the same idea of what your "local machine" is.
//!
//! Values are passed using the [`Ffi`] type. There are four categories of value that the [`Ffi`] type enables you to pass:
//!
//! | `Ffi` Type Category | Placed Backing? | Owned? |
//! |---------------------|-----------------|--------|
//! | Slice               | Heap/Stack      | No     |
//! | Referenced Slice    | Stack           | No     |
//! | Referenced Value    | No              | No     |
//! | Owned Value         | Heap            | Yes    |
//!
//! For an FFI type to have a "placed backing" means that it has some data structure, beyond the data it represents, placed
//! somewhere in memory. Some types only require stack placement while others offer both stack and heap placement.
//!
//! Stack-placed values can be shared by `PinnedRef` and `PinnedMut`, and thus can only be owned by the caller.
//!
//! Heap-placed values can be shared by `Owned`, `PinnedRef`, and `PinnedMut`. They can be owned by any one consumer;
//! when the owning handle is `Drop`'d by that sole consumer, it frees the object using the associated `Drop` callback.
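//!
//! A minimal sketch of the intended flow for a heap-placed, owned value (using the types defined in this module;
//! exact call sites in the runtime may differ):
//!
//! ```ignore
//! let value = FfiValue::new(42u32);            // heap-placed, owned by this handle
//! let pinned = value.pin();                    // pin guard borrowing the heap placement
//! let shared = unsafe { pinned.grant_ref() };  // `FfiValueRef<u32>` that can be handed to a callee
//! assert_eq!(*shared, 42);
//! drop(pinned);                                // release the borrow before reclaiming ownership
//! let inner: u32 = value.take();               // move the value back out, freeing the heap backing
//! ```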
//!
//! ### Safety Intents
//!
//! This API is designed to simplify interaction with FFI values and to provide a static ABI for those values to be passed. It
//! is key to enabling ownership across FFI while ensuring the associated dropping and allocation-freeing logic is run.
//!
//! The contract the developer has to follow is made simpler by this system, and it allows generic code to be written to
//! interact with FFI-given values and to pass values using FFI.
//!
//! ### Stability Guarantees
//!
//! There are no plans to guarantee stability until 1.0.0. At that point you can expect the ABI to remain stable until the major
//! version is incremented again. There will be an appropriate deprecation process and changeover period.
//!
//! ### Memory Representation
//!
//! Please check out the source code, sorry if you needed that from the docs!
//!
//! ## Object, Operation, Setting, Value, Plugin, and Runtime ABIs
//!
//! The Giterated Runtime uses vtables to ensure maximum compatibility. For every object that is shared
//! between plugins, a vtable is used to allow each plugin to provide its own code for interacting with the object.
//!
//! When objects switch "runtime domains" (e.g. host -> plugin, plugin -> plugin, plugin -> host), their vtable is swapped out
//! for the new runtime domain's own vtable.
//!
//! ### Untyped "Objects" (see above header for list)
//!
//! Untyped objects are represented in memory by a data pointer and a vtable pointer, exactly like Rust trait objects. However, to
//! prevent small compilation differences and other random garbage from making the interface subtly incompatible, we use
//! the local plugin's idea of the vtable for the object at all times. An object that the plugin does not have a vtable for cannot
//! be relevant to the plugin.
//!
//! It is important that the object's base representation in memory remains unchanged between major versions, but the vtables that
//! provide methods for that object may be grown. The methods that operate on that object may be changed in a non-breaking fashion,
//! and bugs can be fixed.
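//!
//! As a rough illustration, an untyped object handle looks something like the following (the names here are
//! illustrative, not the actual definitions in the [`vtable`] module):
//!
//! ```ignore
//! #[repr(C)]
//! struct UntypedObject {
//!     /// Pointer to the object's data.
//!     data: *const (),
//!     /// Pointer to the *local* plugin's vtable for this object type; swapped out
//!     /// whenever the object crosses a runtime-domain boundary.
//!     vtable: *const ObjectVtable,
//! }
//! ```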
//!
//! ## Futures ABI
//!
//! The Giterated Runtime has an async runtime that allows futures to be shared and awaited across FFI boundaries while only
//! executing the future within the context of the Plugin that is running the underlying future.
//!
//! Futures are spawned onto the `RuntimeState` with the `RuntimeFuturesExt` trait. This takes a Rust future, boxes it, and
//! provides a `RuntimeFuture` handle that can be used to drive the underlying Rust future locally. The `RuntimeFuture` handle
//! is thread-safe and can be shared with the callee and `.await`'d directly like any other future.
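//!
//! In practice the intended call pattern looks roughly like this (the spawn method name is illustrative; the real
//! signature lives on the `RuntimeFuturesExt` trait):
//!
//! ```ignore
//! // Spawn a Rust future onto the runtime state; the returned `RuntimeFuture`
//! // handle is FFI-safe and can be handed across the boundary.
//! let handle = runtime_state.spawn_future(async { do_work().await });
//!
//! // The handle awaits like any other future, while the inner future itself is
//! // still executed by the plugin that spawned it.
//! let result = handle.await;
//! ```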
//!
//! ### RuntimeFuture
//!
//! The `RuntimeFuture` mixes a vtable with data to allow any caller to drive a spawned future. It contains:
//!
//! - A `poll_fn`, which is used to poll the future for `Ready`-ness.
//! - A `wake_fn`, which is used to wake the callee to poll for (expected) `Ready`-ness; it is populated when the `RuntimeFuture` is `await`'d.
//!
//! When the `RuntimeFuture` is polled, it causes the inner future to also be polled. We provide the inner future with a waker
//! that triggers the `RuntimeFuture`'s waker so it is polled again. Breaking character to point out how freaking cool that is.
//!
//! `RuntimeFuture`s drop the associated inner future as they drop.
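//!
//! The polling flow can be pictured roughly like this (method and field names are illustrative, and the exact shape
//! of [`RuntimeFuturePoll`] is assumed; the real definitions live in the `future` module):
//!
//! ```ignore
//! fn poll(self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<T> {
//!     // Remember the outer task's waker; `wake_fn` uses it to re-wake this handle
//!     // once the plugin-side future makes progress.
//!     self.register_waker(cx.waker().clone());
//!
//!     // Drive the inner future inside the plugin that owns it.
//!     match (self.poll_fn)(self.future_data) {
//!         RuntimeFuturePoll::Ready(value) => Poll::Ready(value),
//!         RuntimeFuturePoll::Pending => Poll::Pending,
//!     }
//! }
//! ```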

pub mod callback;
mod future;
pub mod heap;
pub mod model_impl;
pub mod result;
pub mod state;
pub mod vtable;

use abi_backing::{HeapValueBacking, SliceBacking};
pub use future::{FfiFuture, RuntimeFuturePoll};
use heap::HeapPlacable;
use prelude::value_ex::FfiValueUntyped;

use std::{
    marker::PhantomData,
    mem::{transmute, MaybeUninit},
    ops::{Deref, DerefMut},
};

use abi_types::{Slice, SliceMut, SliceRef, Value, ValueMut, ValueRef};
use guards::{HeapPinnedSlice, HeapPinnedValue, StackPinnedSlice, StackPinnedValue};

#[doc(hidden)]
pub mod prelude {
    pub use crate::Ffi;
    pub use crate::StackPinned;
    pub use crate::*;
    pub use crate::{FfiSlice, FfiSliceRef, FfiValue, FfiValueRef};
}

/// Slice Reference
/// Heap or Stack Placed
pub type FfiSliceRef<T> = Ffi<T, SliceRef>;

/// Mutable Slice Reference
/// Heap or Stack Placed
pub type FfiSliceMut<T> = Ffi<T, SliceMut>;

/// Value Reference
/// Heap or Stack Placed
pub type FfiValueRef<T> = Ffi<T, ValueRef>;

/// Mutable Value Reference
/// Heap or Stack Placed
pub type FfiValueMut<T> = Ffi<T, ValueMut>;

/// Owned Value
/// Heap Placed
pub type FfiValue<T> = Ffi<T, Value>;

/// Owned Slice
/// Heap Placed
pub type FfiSlice<T> = Ffi<T, Slice>;

pub mod value_ex {
    use crate::{abi_types::Value, Ffi};

    pub type FfiValueUntyped = Ffi<(), Value>;
    pub type FfiValueRefUntyped = Ffi<(), Value>;
}

/// A value passed over FFI, following the Giterated ABI.
///
/// The function of the [`Ffi`] type is to take an arbitrary pointer and send it over FFI.
/// Both the caller and callee **must** have the same understanding of what the pointer represents.
/// The [`Ffi`] type is also used to encode ownership information.
///
/// # The Pointer
/// The pointer contained within the [`Ffi`] is transmuted based on the `ABI` marker provided in the
/// [`Ffi`] type signature.
#[repr(transparent)]
pub struct Ffi<T: ?Sized, ABI> {
    /// Type-erased pointer to the value itself or to its placed backing structure.
    inner: *const (),
    _type_marker: PhantomData<T>,
    _abi_marker: PhantomData<ABI>,
}

impl<T> FfiSlice<T> {
    #[inline(always)]
    pub fn pin(&self) -> HeapPinnedSlice<'_, T> {
        unsafe { HeapPinnedSlice::from_raw(self) }
    }
}

impl<T> Deref for FfiSlice<T> {
    type Target = [T];

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const SliceBacking<[T]> = unsafe { transmute(self.inner) };
        let backing = unsafe { inner.as_ref().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts(
                backing.slice as *const T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}

impl<T> DerefMut for FfiSlice<T> {
    #[inline(always)]
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut SliceBacking<[T]> = unsafe { transmute(self.inner) };
        let backing = unsafe { inner.as_mut().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts_mut(
                backing.slice as *mut T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}

impl<T> FfiSliceRef<T> {}

impl<T> Deref for FfiSliceRef<[T]> {
    type Target = [T];

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const SliceBacking<[T]> = unsafe { transmute(self.inner) };

        let backing = unsafe { inner.as_ref().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts(
                backing.slice as *const T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}

impl<T> FfiValueRef<T> {}

impl<T> Deref for FfiValueRef<T> {
    type Target = T;

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const T = unsafe { transmute(self.inner) };

        match unsafe { inner.as_ref() } {
            Some(val) => val,
            _ => unreachable!(),
        }
    }
}

impl<T> Deref for FfiValueMut<T> {
    type Target = T;

    fn deref(&self) -> &Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_ref().unwrap_unchecked() }
    }
}

impl<T> DerefMut for FfiValueMut<T> {
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_mut().unwrap_unchecked() }
    }
}

impl<T> std::fmt::Display for FfiValueRef<T>
where
    T: std::fmt::Display,
{
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        unsafe { (self.inner as *const T).as_ref().unwrap() }.fmt(f)
    }
}

impl<T> FfiValue<T> {
    pub fn new(value: T) -> Self {
        // Heap-place the value together with the drop callback that will later free it.
        let value = Box::new(HeapValueBacking {
            value,
            drop_fn: <T as HeapPlacable>::free,
        });

        FfiValue {
            inner: Box::into_raw(value) as _,
            _type_marker: PhantomData,
            _abi_marker: PhantomData,
        }
    }

    pub fn erase_type(self) -> FfiValueUntyped {
        unsafe { transmute(self) }
    }

    pub fn pin(&self) -> HeapPinnedValue<'_, T> {
        unsafe { HeapPinnedValue::from_raw(self) }
    }

    pub fn take(self) -> T {
        // This all boils down to moving `T` out of the `FfiValue` and dropping the backing
        // storage for said `FfiValue`. Despite the use of unsafe this is exactly how moving
        // a value onto the stack works.

        let inner = self.inner as *mut T;
        let mut move_target: MaybeUninit<T> = MaybeUninit::zeroed();

        unsafe { move_target.as_mut_ptr().copy_from(inner, 1) }

        let inner_descriptor: *mut HeapValueBacking<T> = unsafe { transmute(self.inner) };

        // `taken = true` tells the drop callback that the value itself has already been
        // moved out, so only the backing allocation should be released.
        unsafe { (inner_descriptor.as_mut().unwrap_unchecked().drop_fn)(self, true) };

        unsafe { move_target.assume_init() }
    }
}

impl<T> Deref for FfiValue<T> {
    type Target = T;

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const T = unsafe { transmute(self.inner) };

        unsafe { inner.as_ref().unwrap_unchecked() }
    }
}

impl<T> DerefMut for FfiValue<T> {
    #[inline(always)]
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_mut().unwrap_unchecked() }
    }
}

mod abi_backing {
    use std::{marker::PhantomData, mem::transmute};

    use crate::{FfiSlice, FfiValue};

    #[repr(C)]
    pub struct HeapValueBacking<T: Sized> {
        pub(super) value: T,
        pub(super) drop_fn: unsafe extern "C" fn(value: FfiValue<T>, taken: bool),
    }

    pub struct SliceBacking<T: ?Sized> {
        pub(crate) count: u64,
        pub(crate) slice: *const (),
        _marker: PhantomData<T>,
    }

    impl<T: ?Sized> SliceBacking<T> {
        /// Creates a new slice backing from a raw slice pointer and a count.
        ///
        /// # SAFETY
        ///
        /// `slice` **must** refer to a valid slice, with a length greater than or equal to the
        /// value provided as `count`.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(count: u64, slice: *const ()) -> Self {
            Self {
                count,
                slice,
                _marker: PhantomData,
            }
        }

        /// Creates a new slice backing from an [`FfiSlice`].
        ///
        /// # SAFETY
        ///
        /// The resultant [`SliceBacking`] **must not** outlive the backing [`FfiSlice`].
        #[inline(always)]
        pub(crate) unsafe fn from_heap(slice: &FfiSlice<T>) -> Self {
            let heap_backing: *const SliceBacking<T> = unsafe { transmute(slice.inner) };

            let heap_backing = unsafe { heap_backing.as_ref().unwrap_unchecked() };

            Self {
                count: heap_backing.count,
                slice: heap_backing.slice,
                _marker: PhantomData,
            }
        }
    }
}

mod guards {
    use std::marker::PhantomData;

    use crate::{
        abi_backing::SliceBacking, Ffi, FfiSlice, FfiSliceMut, FfiSliceRef, FfiValue, FfiValueMut,
        FfiValueRef,
    };

    #[repr(transparent)]
    pub struct StackPinnedSlice<'v, T: ?Sized> {
        _lifetime: PhantomData<&'v T>,
        slice: SliceBacking<T>,
    }

    impl<'v, T> StackPinnedSlice<'v, T> {
        #[inline(always)]
        pub fn as_ref(&self) -> FfiSliceRef<T> {
            FfiSliceRef {
                inner: &self.slice as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        #[inline(always)]
        pub fn as_mut(&mut self) -> FfiSliceMut<T> {
            FfiSliceMut {
                inner: &mut self.slice as *mut _ as *mut (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

    impl<'v, T> StackPinnedSlice<'v, T> {
        /// Creates a stack pinned slice guard from a borrowed slice.
        ///
        /// # SAFETY
        /// This function itself isn't "unsafe" but other code will become unsafe if the `slice`
        /// becomes invalid or moves. You'd have to violate safety rules somewhere else to do that,
        /// though.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(slice: &'v [T]) -> StackPinnedSlice<'v, T> {
            Self {
                _lifetime: PhantomData,
                slice: SliceBacking::from_raw(
                    u64::try_from(slice.len()).unwrap(),
                    slice.as_ptr() as *const (),
                ),
            }
        }
    }

    pub struct StackPinnedValue<'v, T> {
        value_ref: &'v T,
    }

    impl<'v, T> StackPinnedValue<'v, T> {
        /// Grants a reference to the pinned value.
        ///
        /// # SAFETY
        /// - The granted reference **must not** outlive the lifetime of `&self`.
        /// - There **must not** be a mutable reference created or mutable dereference performed during the lifetime of the [`FfiValueRef`].
        #[inline(always)]
        pub unsafe fn grant_ref(&self) -> FfiValueRef<T> {
            Ffi {
                inner: self.value_ref as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

    impl<'v, T> StackPinnedValue<'v, T> {
        #[inline(always)]
        pub(crate) fn from_raw(value: &'v T) -> Self {
            Self { value_ref: value }
        }
    }

    pub struct HeapPinnedSlice<'v, T> {
        _lifetime: PhantomData<&'v T>,
        slice: SliceBacking<T>,
    }

    impl<'v, T> HeapPinnedSlice<'v, T> {
        /// Creates a pin guard from a heap placed slice.
        ///
        /// # SAFETY
        /// The `slice` **must not** be moved and **must not** have a mutable reference given during the lifetime
        /// of the returned [`HeapPinnedSlice`] guard.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(slice: &'v FfiSlice<T>) -> HeapPinnedSlice<'v, T> {
            Self {
                _lifetime: PhantomData,
                slice: SliceBacking::from_heap(slice),
            }
        }

        pub unsafe fn grant_ref(&self) -> FfiSliceRef<T> {
            FfiSliceRef {
                inner: &self.slice as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        pub unsafe fn grant_mut(&mut self) -> FfiSliceMut<T> {
            FfiSliceMut {
                inner: &mut self.slice as *mut _ as *mut (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

    #[repr(transparent)]
    pub struct HeapPinnedValue<'v, T> {
        value: &'v FfiValue<T>,
    }

    impl<'v, T> HeapPinnedValue<'v, T> {
        #[inline(always)]
        pub(crate) unsafe fn from_raw(value: &'v FfiValue<T>) -> HeapPinnedValue<'v, T> {
            Self { value }
        }

        #[inline(always)]
        pub unsafe fn grant_ref(&self) -> FfiValueRef<T> {
            FfiValueRef {
                inner: self.value.inner,
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        #[inline(always)]
        pub unsafe fn grant_mut(&mut self) -> FfiValueMut<T> {
            FfiValueMut {
                inner: self.value.inner,
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }
}

/// Marker types used as the `ABI` parameter of [`Ffi`] to select how a value is passed.
mod abi_types {
    pub struct Slice;

    pub struct SliceRef;

    pub struct SliceMut;

    pub struct ValueRef;

    pub struct ValueMut;

    pub struct Value;
}

/// Pins a borrowed value or slice in place so that FFI references can be granted from the
/// resulting guard.
pub trait StackPinned<'p> {
    type Pinned: ?Sized + 'p;

    fn pin(&'p self) -> Self::Pinned;
}

impl<'p, T: 'p> StackPinned<'p> for [T] {
    type Pinned = StackPinnedSlice<'p, T>;

    #[inline(always)]
    fn pin(&'p self) -> StackPinnedSlice<'p, T> {
        unsafe { StackPinnedSlice::from_raw(self) }
    }
}

impl<'p, T: 'p> StackPinned<'p> for T {
    type Pinned = StackPinnedValue<'p, T>;

    #[inline(always)]
    fn pin(&'p self) -> Self::Pinned {
        StackPinnedValue::from_raw(self)
    }
}
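
// Usage sketch: a minimal illustration of how the pieces above fit together. It assumes
// `HeapPlacable` provides a blanket implementation for sized types (its definition lives in
// the `heap` module and is not shown here), so treat it as a sketch rather than a guarantee
// of runtime behaviour.
#[cfg(test)]
mod usage_sketch {
    use super::*;

    #[test]
    fn heap_value_round_trip() {
        // Heap-place a value, share a pinned reference, then move the value back out.
        let value = FfiValue::new(7u32);
        {
            let pinned = value.pin();
            let shared = unsafe { pinned.grant_ref() };
            assert_eq!(*shared, 7);
        }
        assert_eq!(value.take(), 7);
    }

    #[test]
    fn stack_pinned_slice() {
        // Stack-pin a borrowed slice and hand out an FFI slice reference to it.
        let data = [1u8, 2, 3];
        let pinned = data[..].pin();
        let _shared: FfiSliceRef<u8> = pinned.as_ref();
        // `_shared` may now be passed across the FFI boundary for the duration of the call,
        // as long as `data` is neither moved nor mutated while it is in use.
    }
}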