
ambee/giterated

Git repository hosting, collaboration, and discovery for the Fediverse.

no clue what this is

Amber - 1 year ago

parent: tbd commit: 7889bf6

giterated-runtime/giterated-abi/src/lib.rs - 18919 bytes
//! Giterated ABI
//! # ABI
//!
//! ## Value ABI
//!
//! At its core, the Giterated Runtime uses the `extern "C"` ABI. What that means in practice is platform specific, and that is
//! fine: the Giterated Runtime and its Plugins are intended to be compiled for your local machine, all with the same idea of
//! what your "local machine" is.
//!
//! Values are passed using the `FFI` type. There are four categories of value that the `FFI` type enables you to pass:
//!
//! | `FFI` Type Category | Placed Backing? | Owned? |
//! |---------------------|-----------------|--------|
//! | Slice | Heap/Stack | No |
//! | Referenced Slice | Stack | No |
//! | Referenced Value | No | No |
//! | Owned Value | Heap | Yes |
//!
//! For an FFI type to have a "placed backing" means it carries some data structure beyond the data it represents, placed
//! somewhere in memory. Some types only require stack placement while some offer both stack and heap placement.
//!
//! Stack-placed values can be shared by `PinnedRef` and `PinnedMut`, and thus can only be owned by the caller.
//!
//! Heap-placed values can be shared by `Owned`, `PinnedRef`, and `PinnedMut`. They can be owned by any one consumer.
//! When the handle with ownership is `Drop`'d by that sole consumer, it frees the object using the associated `Drop` callback.
//!
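//! As a rough guide, these categories correspond to the aliases declared later in this module. The sketch below is
//! illustrative only and is not compiled as a doctest:
//!
//! ```ignore
//! FfiValue<T>     // Owned Value: heap placed, owned
//! FfiValueRef<T>  // Value Reference: borrowed, never owned by the callee
//! FfiSlice<T>     // Owned Slice: heap placed, owned
//! FfiSliceRef<T>  // Slice Reference: borrowed, never owned by the callee
//! ```
//!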
//! ### Safety Intents
//!
//! This API is designed to simplify interaction with FFI values and provide a static ABI for those values to be passed. It
//! is key to enabling ownership across FFI while ensuring the associated dropping and allocation-freeing logic is run.
//!
//! This system simplifies the contract the developer has to follow, and it allows generic code to be written both to
//! interact with FFI-given values and to pass values using FFI.
//!
//! ### Stability Guarantees
//!
//! There are no plans to guarantee stability until 1.0.0. At that point you can expect the ABI to remain stable until the major version
//! is incremented again. There will be an appropriate deprecation process and changeover period.
//!
//! ### Memory Representation
//!
//! Please check out the source code, sorry if you needed that from the docs!
//!
//! ## Object, Operation, Setting, Value, Plugin, and Runtime ABIs
//!
//! The Giterated Runtime uses vtables to ensure maximum compatibility. For every object that is shared
//! between plugins, a vtable is used to allow each plugin to provide its own code for interacting with the object.
//!
//! When objects switch "runtime domains" (e.g. host -> plugin, plugin -> plugin, plugin -> host), their vtable is swapped out
//! for the new runtime domain's own vtable.
//!
//! ### Untyped "Objects" (see above header for list)
//!
//! Untyped objects are represented in memory by a data pointer and a vtable pointer, exactly like a Rust trait object. However, to
//! prevent small compilation differences from making the interface subtly incompatible, we use the local plugin's idea of the
//! vtable for the object at all times. An object that the plugin does not have a vtable for cannot be relevant to the plugin.
//!
//! It is important that the object's base representation in memory remain unchanged between major versions, but the vtables that
//! provide methods for that object may be grown. The methods that operate on that object may be changed in a non-breaking fashion,
//! and bugs can be fixed.
//!
//! ## Futures ABI
//!
//! The Giterated Runtime has an async runtime that allows futures to be shared and awaited across FFI boundaries while only
//! executing the future within the context of the Plugin that is running the underlying future.
//!
//! Futures are spawned onto the `RuntimeState` with the `RuntimeFuturesExt` trait. This takes a Rust future, boxes it, and
//! provides a `RuntimeFuture` handle that can be used to drive the underlying Rust future locally. The `RuntimeFuture` handle
//! is thread-safe and can be shared with the callee and `.await`'d directly like any other future.
//!
//! ### RuntimeFuture
//!
//! The `RuntimeFuture` mixes a vtable with data to allow any caller to drive a spawned future. It contains:
//!
//! - A `poll_fn` which is used to poll the future for `Ready`-ness
//! - A `wake_fn` which is used to wake the callee to poll for (expected) `Ready`-ness; it is populated when the `RuntimeFuture` is `await`'d.
//!
//! When the `RuntimeFuture` is polled, it causes the inner future to also be polled. We provide the inner future with a waker
//! that triggers the `RuntimeFuture`'s waker so it is polled again. Breaking character to point out how freaking cool that is.
//!
//! `RuntimeFuture`s drop the associated inner future as they drop.
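//!
//! A rough sketch of the intended flow. The method names below are hypothetical, for illustration only; the real entry
//! points are provided by the runtime's `RuntimeFuturesExt` trait and are not defined in this module:
//!
//! ```ignore
//! // Hypothetical: spawn a Rust future onto the runtime state, receiving a
//! // `RuntimeFuture` handle that may cross the FFI boundary.
//! let handle = runtime_state.spawn_future(async { do_work().await });
//!
//! // The callee can `.await` the handle like any other future; polling it drives
//! // the inner future within the plugin that owns it.
//! let result = handle.await;
//! ```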

pub mod callback;
mod future;
pub mod heap;
pub mod model_impl;
pub mod operation;
pub mod plugin;
pub mod result;
pub mod state;
pub mod vtable;
use abi_backing::{HeapValueBacking, SliceBacking};
pub use future::{FfiFuture, RuntimeFuturePoll};
use heap::HeapPlacable;
use prelude::value_ex::FfiValueUntyped;

use std::{
    marker::PhantomData,
    mem::{transmute, MaybeUninit},
    ops::{Deref, DerefMut},
};

use abi_types::{Slice, SliceMut, SliceRef, Value, ValueMut, ValueRef};
use guards::{HeapPinnedSlice, HeapPinnedValue, StackPinnedSlice, StackPinnedValue};

#[doc(hidden)]
pub mod prelude {
    pub use crate::Ffi;
    pub use crate::StackPinned;
    pub use crate::*;
    pub use crate::{FfiSlice, FfiSliceRef, FfiValue, FfiValueRef};
}

/// Slice Reference
/// Heap or Stack Placed
pub type FfiSliceRef<T> = Ffi<T, SliceRef>;

/// Mutable Slice Reference
/// Heap or Stack Placed
pub type FfiSliceMut<T> = Ffi<T, SliceMut>;

/// Value Reference
/// Heap or Stack Placed
pub type FfiValueRef<T> = Ffi<T, ValueRef>;

/// Mutable Value Reference
/// Heap or Stack Placed
pub type FfiValueMut<T> = Ffi<T, ValueMut>;

/// Owned Value
/// Heap Placed
pub type FfiValue<T> = Ffi<T, Value>;

/// Owned Slice
/// Heap Placed
pub type FfiSlice<T> = Ffi<T, Slice>;

pub mod value_ex {
    use crate::{abi_types::Value, Ffi};

    pub type FfiValueUntyped = Ffi<(), Value>;
    pub type FfiValueRefUntyped = Ffi<(), Value>;
}

/// A value passed over FFI, following the Giterated ABI.
///
/// The function of the [`Ffi`] type is to take an arbitrary pointer and send it over FFI.
/// Both the caller and callee **must** have the same understanding of what the pointer represents.
/// The [`Ffi`] type is also used to encode ownership information.
///
/// # The Pointer
/// The pointer contained within the [`Ffi`] is transmuted based on the provided `ABI` on the
/// [`Ffi`] type signature.
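///
/// # Example
///
/// A minimal sketch of the intended usage, assuming the value type satisfies the crate's
/// `HeapPlacable` bound (not compiled as a doctest):
///
/// ```ignore
/// // Heap-place a value and obtain the owned handle.
/// let value: FfiValue<u32> = FfiValue::new(42);
///
/// // The owner can dereference it like a `u32`, or move it back out.
/// assert_eq!(*value, 42);
/// let inner: u32 = value.take();
/// ```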
#[repr(transparent)]
pub struct Ffi<T: ?Sized, ABI> {
    inner: *const (),
    _type_marker: PhantomData<T>,
    _abi_marker: PhantomData<ABI>,
}

impl<T> FfiSlice<T> {
    #[inline(always)]
    pub fn pin(&self) -> HeapPinnedSlice<'_, T> {
        unsafe { HeapPinnedSlice::from_raw(self) }
    }
}

impl<T> Deref for FfiSlice<T> {
    type Target = [T];

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const SliceBacking<[T]> = unsafe { transmute(self.inner) };
        let backing = unsafe { inner.as_ref().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts(
                backing.slice as *const T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}

impl<T> DerefMut for FfiSlice<T> {
    #[inline(always)]
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut SliceBacking<[T]> = unsafe { transmute(self.inner) };
        let backing = unsafe { inner.as_mut().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts_mut(
                backing.slice as *mut T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}

impl<T: ?Sized> FfiSliceRef<T> {
    pub fn static_ref(source: &'static T) -> FfiSliceRef<T> {
        todo!()
    }
}

impl<T> Deref for FfiSliceRef<[T]> {
    type Target = [T];

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const SliceBacking<[T]> = unsafe { transmute(self.inner) };

        let backing = unsafe { inner.as_ref().unwrap_unchecked() };

        unsafe {
            core::slice::from_raw_parts(
                backing.slice as *const T,
                usize::try_from(backing.count).unwrap_unchecked(),
            )
        }
    }
}

impl<T> FfiValueRef<T> {}

impl<T> Deref for FfiValueRef<T> {
    type Target = T;

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const T = unsafe { transmute(self.inner) };

        match unsafe { inner.as_ref() } {
            Some(val) => val,
            _ => unreachable!(),
        }
    }
}

impl<T> FfiValue<T> {
    pub unsafe fn from_raw_ptr(ptr: *mut T) -> FfiValue<T> {
        todo!()
    }

    pub unsafe fn ptr(&self) -> *const T {
        self.inner as *const T
    }
}

impl<T> Deref for FfiValueMut<T> {
    type Target = T;

    fn deref(&self) -> &Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_ref().unwrap_unchecked() }
    }
}

impl<T> DerefMut for FfiValueMut<T> {
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_mut().unwrap_unchecked() }
    }
}

impl<T> std::fmt::Display for FfiValueRef<T>
where
    T: std::fmt::Display,
{
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        unsafe { (self.inner as *const T).as_ref().unwrap() }.fmt(f)
    }
}

impl<T> FfiValue<T> {
    /// Heap-places `value`, recording the `HeapPlacable::free` callback used to free the backing later.
    pub fn new(value: T) -> Self {
        let value = Box::new(HeapValueBacking {
            value,
            drop_fn: <T as HeapPlacable>::free,
        });

        FfiValue {
            inner: Box::into_raw(value) as _,
            _type_marker: PhantomData,
            _abi_marker: PhantomData,
        }
    }

    /// Erases the value's type, leaving an untyped handle.
    pub fn erase_type(self) -> FfiValueUntyped {
        unsafe { transmute(self) }
    }

    /// Pins the heap-placed value so references to it can be granted.
    pub fn pin(&self) -> HeapPinnedValue<'_, T> {
        unsafe { HeapPinnedValue::from_raw(self) }
    }

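    /// Moves the value out of its heap backing and frees the backing storage.
    ///
    /// A minimal usage sketch, assuming `u32` satisfies the crate's `HeapPlacable` bound
    /// (not compiled as a doctest):
    ///
    /// ```ignore
    /// let value = FfiValue::new(42u32);
    /// let inner: u32 = value.take();
    /// assert_eq!(inner, 42);
    /// ```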
    pub fn take(self) -> T {
        // This all boils down to moving `T` out of the `FfiValue` and dropping the backing
        // storage for said `FfiValue`. Despite the use of unsafe this is exactly how moving
        // a value onto the stack works.

        let inner = self.inner as *mut T;
        let mut move_target: MaybeUninit<T> = MaybeUninit::zeroed();

        unsafe { move_target.as_mut_ptr().copy_from(inner, 1) }

        let inner_descriptor: *mut HeapValueBacking<T> = unsafe { transmute(self.inner) };

        unsafe { (inner_descriptor.as_mut().unwrap_unchecked().drop_fn)(self, true) };

        unsafe { move_target.assume_init() }
    }
}

impl<T> Deref for FfiValue<T> {
    type Target = T;

    #[inline(always)]
    fn deref(&self) -> &Self::Target {
        let inner: *const T = unsafe { transmute(self.inner) };

        unsafe { inner.as_ref().unwrap_unchecked() }
    }
}

impl<T> DerefMut for FfiValue<T> {
    #[inline(always)]
    fn deref_mut(&mut self) -> &mut Self::Target {
        let inner: *mut T = unsafe { transmute(self.inner) };

        unsafe { inner.as_mut().unwrap_unchecked() }
    }
}

mod abi_backing {
    use std::{marker::PhantomData, mem::transmute};

    use crate::{FfiSlice, FfiValue};

    #[repr(C)]
    pub struct HeapValueBacking<T: Sized> {
        pub(super) value: T,
        pub(super) drop_fn: unsafe extern "C" fn(value: FfiValue<T>, taken: bool),
    }

    pub struct SliceBacking<T: ?Sized> {
        pub(crate) count: u64,
        pub(crate) slice: *const (),
        _marker: PhantomData<T>,
    }

    impl<T: ?Sized> SliceBacking<T> {
        /// Creates a new slice backing from a raw slice pointer and a count.
        ///
        /// # SAFETY
        ///
        /// `slice` **must** refer to a valid slice, with a length greater than or equal to the
        /// value provided as `count`.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(count: u64, slice: *const ()) -> Self {
            Self {
                count,
                slice,
                _marker: PhantomData,
            }
        }

        /// Creates a new slice backing from an [`FfiSlice`].
        ///
        /// # SAFETY
        ///
        /// The resultant [`SliceBacking`] **must not** outlive the backing [`FfiSlice`].
        #[inline(always)]
        pub(crate) unsafe fn from_heap(slice: &FfiSlice<T>) -> Self {
            let heap_backing: *const SliceBacking<T> = unsafe { transmute(slice.inner) };

            let heap_backing = unsafe { heap_backing.as_ref().unwrap_unchecked() };

            Self {
                count: heap_backing.count,
                slice: heap_backing.slice,
                _marker: PhantomData,
            }
        }
    }
}

mod guards {
    use std::marker::PhantomData;

    use crate::{
        abi_backing::SliceBacking, Ffi, FfiSlice, FfiSliceMut, FfiSliceRef, FfiValue, FfiValueMut,
        FfiValueRef,
    };

    #[repr(transparent)]
    pub struct StackPinnedSlice<'v, T: ?Sized> {
        _lifetime: PhantomData<&'v T>,
        slice: SliceBacking<T>,
    }

    impl<'v, T> StackPinnedSlice<'v, T> {
        #[inline(always)]
        pub fn grant_ref(&self) -> FfiSliceRef<T> {
            FfiSliceRef {
                inner: &self.slice as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        #[inline(always)]
        pub fn grant_mut(&mut self) -> FfiSliceMut<T> {
            FfiSliceMut {
                inner: &mut self.slice as *mut _ as *mut (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

    impl<'v, T> StackPinnedSlice<'v, T> {
        /// Creates a stack pinned slice guard from a borrowed slice.
        ///
        /// # SAFETY
        /// This function isn't itself "unsafe", but other code becomes unsound if the `slice`
        /// is invalidated or moved while the guard is live. You'd have to violate safety rules
        /// somewhere else to do that, though.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(slice: &'v [T]) -> StackPinnedSlice<'v, T> {
            Self {
                _lifetime: PhantomData,
                slice: SliceBacking::from_raw(
                    u64::try_from(slice.len()).unwrap(),
                    slice.as_ptr() as *const (),
                ),
            }
        }
    }

    pub struct StackPinnedValue<'v, T: ?Sized> {
        value_ref: &'v T,
    }

    impl<'v, T: ?Sized> StackPinnedValue<'v, T> {
        /// Grants a reference to the pinned value.
        ///
        /// # SAFETY
        /// - The granted reference **must not** outlive the lifetime of `&self`.
        /// - There **must not** be a mutable reference created or mutable dereference performed during the lifetime of the [`FfiValueRef`].
        #[inline(always)]
        pub unsafe fn grant_ref(&self) -> FfiValueRef<T> {
            Ffi {
                inner: self.value_ref as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        /// Grants a mutable reference to the pinned value.
        ///
        /// # SAFETY
        /// - The granted reference **must not** outlive the lifetime of `&mut self`.
        /// - There **must not** be any other reference created or dereference performed during the lifetime of the [`FfiValueMut`].
        #[inline(always)]
        pub unsafe fn grant_mut(&mut self) -> FfiValueMut<T> {
            Ffi {
                inner: self.value_ref as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

    impl<'v, T: ?Sized> StackPinnedValue<'v, T> {
        #[inline(always)]
        pub(crate) fn from_raw(value: &'v T) -> Self {
            Self { value_ref: value }
        }
    }

    pub struct HeapPinnedSlice<'v, T> {
        _lifetime: PhantomData<&'v T>,
        slice: SliceBacking<T>,
    }

    impl<'v, T> HeapPinnedSlice<'v, T> {
        /// Creates a pin guard from a heap placed slice.
        ///
        /// # SAFETY
        /// The `slice` **must not** be moved and **must not** have a mutable reference given during the lifetime
        /// of the returned [`HeapPinnedSlice`] guard.
        #[inline(always)]
        pub(crate) unsafe fn from_raw(slice: &'v FfiSlice<T>) -> HeapPinnedSlice<'v, T> {
            Self {
                _lifetime: PhantomData,
                slice: SliceBacking::from_heap(slice),
            }
        }

        pub unsafe fn grant_ref(&self) -> FfiSliceRef<T> {
            FfiSliceRef {
                inner: &self.slice as *const _ as *const (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        pub unsafe fn grant_mut(&mut self) -> FfiSliceMut<T> {
            FfiSliceMut {
                inner: &mut self.slice as *mut _ as *mut (),
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }

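    /// Pin guard over a heap-placed [`FfiValue`], used to grant borrowed FFI handles.
    ///
    /// A minimal usage sketch, assuming the value type satisfies the crate's `HeapPlacable`
    /// bound (not compiled as a doctest):
    ///
    /// ```ignore
    /// let value = FfiValue::new(42u32);
    /// let pinned = value.pin();
    /// let shared: FfiValueRef<u32> = unsafe { pinned.grant_ref() };
    /// ```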
    #[repr(transparent)]
    pub struct HeapPinnedValue<'v, T> {
        value: &'v FfiValue<T>,
    }

    impl<'v, T> HeapPinnedValue<'v, T> {
        #[inline(always)]
        pub(crate) unsafe fn from_raw(value: &'v FfiValue<T>) -> HeapPinnedValue<'v, T> {
            Self { value }
        }

        #[inline(always)]
        pub unsafe fn grant_ref(&self) -> FfiValueRef<T> {
            FfiValueRef {
                inner: self.value.inner,
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }

        #[inline(always)]
        pub unsafe fn grant_mut(&mut self) -> FfiValueMut<T> {
            FfiValueMut {
                inner: self.value.inner,
                _type_marker: PhantomData,
                _abi_marker: PhantomData,
            }
        }
    }
}

/// Marker types describing the ABI category encoded in an [`Ffi`] handle's type signature.
mod abi_types {
    pub struct Slice;

    pub struct SliceRef;

    pub struct SliceMut;

    pub struct ValueRef;

    pub struct ValueMut;

    pub struct Value;
}

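/// Types that can be pinned on the stack so that borrowed FFI handles can be granted from them.
///
/// A minimal usage sketch (not compiled as a doctest):
///
/// ```ignore
/// let bytes: &[u8] = &[1, 2, 3];
/// let pinned = bytes.pin();
/// let slice_ref = pinned.grant_ref();
/// ```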
pub trait StackPinned<'p> {
    type Pinned: ?Sized + 'p;

    fn pin(&'p self) -> Self::Pinned;
}

impl<'p, T: 'p> StackPinned<'p> for [T] {
    type Pinned = StackPinnedSlice<'p, T>;

    #[inline(always)]
    fn pin(&'p self) -> StackPinnedSlice<'p, T> {
        unsafe { StackPinnedSlice::from_raw(self) }
    }
}

impl<'p, T: 'p> StackPinned<'p> for T {
    type Pinned = StackPinnedValue<'p, T>;

    #[inline(always)]
    fn pin(&'p self) -> Self::Pinned {
        StackPinnedValue::from_raw(self)
    }
}