Mojo struct
PrefetchOptions
@register_passable(trivial)
struct PrefetchOptions
Collection of configuration parameters for a prefetch intrinsic call.
The configuration follows an interface similar to the LLVM prefetch intrinsic. A "locality" attribute specifies the level of temporal locality in the application, that is, how soon the same data will be visited again. Possible locality values are: NONE, LOW, MEDIUM, and HIGH.
The op also takes a "cache tag" attribute that hints at how the prefetched data will be used. Possible tags are: ReadICache, ReadDCache, and WriteDCache.
Note: the actual behavior of the prefetch op and concrete interpretation of these attributes are target-dependent.
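For example, the options are typically built with the chainable setter methods documented below and handed to the prefetch intrinsic. The sketch below assumes that PrefetchOptions and prefetch are imported from sys.intrinsics and that prefetch accepts the options as a compile-time parameter named params; the exact signature may differ across stdlib versions.

```mojo
from sys.intrinsics import PrefetchOptions, prefetch
from memory import UnsafePointer

fn warm_cache(ptr: UnsafePointer[Float32]):
    # Build the options with the chainable setters documented below:
    # read access, high temporal locality, targeting the data cache.
    alias opts = PrefetchOptions().for_read().high_locality().to_data_cache()
    # Hint the hardware to fetch the cache line containing ptr[0].
    # The "params" parameter name is an assumption; check your stdlib version.
    prefetch[params=opts](ptr)
```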
Fields
- rw (PrefetchRW): Indicates prefetching for read or write.
- locality (PrefetchLocality): Indicates locality level.
- cache (PrefetchCache): Indicates i-cache or d-cache prefetching.
Implemented traits
AnyType, UnknownDestructibility
Methods
__init__
__init__(out self)
Constructs an instance of PrefetchOptions with default params.
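A minimal sketch of default construction; the defaults are assumed here to be read access, high locality, and data-cache prefetching, so confirm against your stdlib version before relying on them.

```mojo
from sys.intrinsics import PrefetchOptions

# Default-constructed options. Assumed defaults: read access, high locality,
# data cache; verify in your toolchain.
var options = PrefetchOptions()
```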
for_read
for_read(self) -> Self
Sets the prefetch purpose to read.
Returns:
The updated prefetch parameter.
for_write
for_write(self) -> Self
Sets the prefetch purpose to write.
Returns:
The updated prefetch parameter.
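For instance, a buffer that will only be read can be configured differently from one that is about to be overwritten; a hedged sketch reusing the chainable-setter pattern:

```mojo
from sys.intrinsics import PrefetchOptions

# Data that will only be read after the prefetch.
alias read_opts = PrefetchOptions().for_read()

# Data that is about to be written, so fetch the line in a writable state.
alias write_opts = PrefetchOptions().for_write()
```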
no_locality
no_locality(self) -> Self
Sets the prefetch locality to none.
Returns:
The updated prefetch parameter.
low_locality
low_locality(self) -> Self
Sets the prefetch locality to low.
Returns:
The updated prefetch parameter.
medium_locality
medium_locality(self) -> Self
Sets the prefetch locality to medium.
Returns:
The updated prefetch parameter.
high_locality
high_locality(self) -> Self
Sets the prefetch locality to high.
Returns:
The updated prefetch parameter.
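As an illustration, the locality setters express how often the data is expected to be reused: a streaming pass that touches each element once might use no_locality, while a hot lookup table reused every iteration would use high_locality. A hedged sketch:

```mojo
from sys.intrinsics import PrefetchOptions

# Streamed once and not revisited: avoid displacing useful cache lines.
alias streaming_opts = PrefetchOptions().no_locality()

# Reused heavily across iterations: keep it as close to the core as possible.
alias hot_opts = PrefetchOptions().high_locality()
```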
to_data_cache
to_data_cache(self) -> Self
Sets the prefetch target to data cache.
Returns:
The updated prefetch parameter.
to_instruction_cache
to_instruction_cache(self) -> Self
Sets the prefetch target to instruction cache.
Returns:
The updated prefetch parameter.
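For example, to_data_cache is the usual choice for array traversals, while to_instruction_cache can hint code that is about to execute; a hedged sketch:

```mojo
from sys.intrinsics import PrefetchOptions

# Prefetch operand data into the data cache (the typical case).
alias data_opts = PrefetchOptions().to_data_cache()

# Prefetch into the instruction cache, e.g. ahead of a cold code path.
alias icache_opts = PrefetchOptions().to_instruction_cache()
```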