- Mark our core test-fixtures jar as part of the "cctest", rather than
a separate library. I'm fairly sure this was actually using the
classpath version of CC rather than the legacyClasspath version!
- Add a new "testMinecraftLibrary" configuration, instead of trying to
infer it from the classpath. We have to jump through some hoops to
avoid having multiple versions of a library on the classpath at once,
but it's not too bad.
I'm working on a patch to bsl which might allow us to kill off
legacyClasspath instead. Please, anything is better than this.
Forge doesn't run client-side commands from sendUnsignedCommand, so we
still require a mixin there.
We do need to change the command name, as Fabric doesn't properly merge
the two command trees.
Rather than mixing-in to CachedOutput, we just wrap our DataProviders to
use a custom CachedOutput which reformats the JSON before writing. This
allows us to drop mixins for common+non-client code.
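A minimal sketch of that wrapping approach, assuming vanilla's CachedOutput/DataProvider signatures on recent versions (the class names and the exact reformatting step here are illustrative, not the real implementation):

```java
// Illustrative only: wraps a DataProvider so every write goes through a
// CachedOutput that reformats JSON first. Vanilla signatures are from
// memory and may not match your mappings/Minecraft version exactly.
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Path;
import java.util.concurrent.CompletableFuture;

import com.google.common.hash.HashCode;
import com.google.common.hash.Hashing;
import com.google.gson.GsonBuilder;
import com.google.gson.JsonParser;

import net.minecraft.data.CachedOutput;
import net.minecraft.data.DataProvider;

final class PrettyJsonProvider implements DataProvider {
    private final DataProvider delegate;

    PrettyJsonProvider(DataProvider delegate) {
        this.delegate = delegate;
    }

    @Override
    public CompletableFuture<?> run(CachedOutput output) {
        // Hand the wrapped provider a CachedOutput that rewrites JSON.
        return delegate.run(new ReformattingOutput(output));
    }

    @Override
    public String getName() {
        return delegate.getName();
    }

    private record ReformattingOutput(CachedOutput delegate) implements CachedOutput {
        @Override
        public void writeIfNeeded(Path path, byte[] data, HashCode hash) throws IOException {
            if (!path.toString().endsWith(".json")) {
                delegate.writeIfNeeded(path, data, hash);
                return;
            }

            // Re-encode the JSON with our preferred formatting, recomputing
            // the hash so the cache stays consistent.
            var json = JsonParser.parseString(new String(data, StandardCharsets.UTF_8));
            var formatted = new GsonBuilder().setPrettyPrinting().create().toJson(json)
                .getBytes(StandardCharsets.UTF_8);
            delegate.writeIfNeeded(path, formatted, Hashing.sha1().hashBytes(formatted));
        }
    }
}
```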
Originally we exposed a single registerTurtleUpgradeModeller method,
which could be called from both Fabric (during a mod's client init) and
Forge (during FMLClientSetupEvent).
This was fine until we allowed upgrades to specify model dependencies,
which are then loaded automatically - meaning model loading now depends
on upgrade modellers being registered. Unbeknownst to me, that ordering
is not guaranteed on Forge: mod setup happens at the same time as
resource reloading!
Unfortunately there's not really a salvageable way of fixing this with
the current API. Forge now uses a registration event-based system,
meaning we can guarantee all modellers are loaded before models are
baked.
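A hypothetical sketch of what event-based registration looks like (the event name, its register method and the mod fields are assumptions; only TurtleUpgradeModeller.flatItem() is taken from the existing API):

```java
// Hypothetical event name and signature - illustrative only.
import net.minecraftforge.eventbus.api.SubscribeEvent;

import dan200.computercraft.api.client.turtle.TurtleUpgradeModeller;

public final class ExampleClientRegistration {
    // Assumed to be registered on the mod event bus elsewhere.
    @SubscribeEvent
    public static void onRegisterTurtleModellers(RegisterTurtleModellersEvent event) {
        // Because registration happens inside a dedicated event, every
        // modeller is known before any model baking starts.
        event.register(ExampleMod.EXAMPLE_UPGRADE.get(), TurtleUpgradeModeller.flatItem());
    }
}
```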
Historically we used Forge's SimpleChannel methods (and
PacketDistributor) to send the packets to the client. However, we don't
need to do that - it is sufficient to convert the message to a vanilla
packet and send it ourselves.
Given we need to do this on Fabric, it makes sense to do this on Forge
as well. This allows us to unify (and thus simplify) a lot of how packet
sending works.
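Roughly, the idea looks like this (NetworkMessage and the channel id are placeholders for our own types; the vanilla packet constructor is as on 1.20.1 and changes in later versions):

```java
// Sketch: send a clientbound message by building the vanilla
// custom-payload packet ourselves, with no SimpleChannel involved.
import io.netty.buffer.Unpooled;

import net.minecraft.network.FriendlyByteBuf;
import net.minecraft.network.protocol.Packet;
import net.minecraft.network.protocol.game.ClientboundCustomPayloadPacket;
import net.minecraft.resources.ResourceLocation;
import net.minecraft.server.level.ServerPlayer;

final class NetworkUtil {
    private static final ResourceLocation CHANNEL_ID = new ResourceLocation("computercraft", "network");

    // Placeholder for our own message interface.
    interface NetworkMessage {
        int id();
        void write(FriendlyByteBuf buf);
    }

    static void sendToPlayer(NetworkMessage message, ServerPlayer player) {
        // Encode the message into a vanilla custom-payload packet...
        var buf = new FriendlyByteBuf(Unpooled.buffer());
        buf.writeVarInt(message.id());
        message.write(buf);
        Packet<?> packet = new ClientboundCustomPayloadPacket(CHANNEL_ID, buf);

        // ...and send it over the player's connection directly.
        player.connection.send(packet);
    }
}
```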
At the same time, we also remove the handling of speaker audio during
decoding. We originally did this to avoid the additional copy of audio
data. However, this doesn't work on 1.20.4 (as packets aren't
encoded/decoded on singleplayer), so it makes sense to do this
Correctly(TM).
This also allows us to get rid of ClientNetworkContext.get(). We do
still need to service load this class (as Forge's networking isn't split
up in the same way Fabric's is), but we'll be able to drop that in
1.20.4.
Finally, we move the record playing code from ClientNetworkContext to
ClientPlatformHelper. This means the network context no longer needs to
be platform-specific!
After that embarrassment, let's do some proper work.
Rather than passing the level and position each time we call
ComponentAccess.get(), we now pass them at construction time (in the
form of the BE). This makes the consuming code a little cleaner, and is
required for the NeoForge changes in 1.20.4.
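An approximate before/after of the shape (not the real signatures):

```java
// Illustrative only.
import javax.annotation.Nullable;

import net.minecraft.core.BlockPos;
import net.minecraft.core.Direction;
import net.minecraft.server.level.ServerLevel;

// Before: callers supplied the level and position on every lookup.
interface OldComponentAccess<T> {
    @Nullable T get(ServerLevel level, BlockPos pos, Direction side);
}

// After: the owning block entity (and hence its level and position) is
// captured when the ComponentAccess is constructed, so lookups only need
// a direction.
interface ComponentAccess<T> {
    @Nullable T get(Direction direction);
}
```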
Everything old is new again!
CC's network message implementation has gone through several iterations:
- Originally network messages were implemented with a single class,
which held a packet id/type and opaque blobs of data (as
string/int/byte/NBT arrays), and a big switch statement to decode and
process this data.
- In 42d3901ee3, we split the messages
into different classes all inheriting from NetworkMessage - this bit
we've stuck with ever since.
Each packet had a `getId(): int` method, which returned the
discriminator for this packet.
- However, getId() was only used when registering the packet, not when
sending, and so in ce0685c31f we
removed it, just passing in a constant integer at registration
instead.
- In 53abe5e56e, we made some relatively
minor changes to make the code more multi-loader/split-source
friendly. However, this meant when we finally came to add Fabric
support (8152f19b6e), we had to
re-implement a lot of Forge's network code.
In 1.20.4, Forge moves to a system much closer to Fabric's (and indeed,
Minecraft's own CustomPacketPayload), and so it makes sense to adapt to
that now. As such, we:
- Add a new MessageType interface. This is implemented by the
loader-specific modules, and holds whatever information is needed to
register the packet (e.g. discriminator, reader function).
- Each NetworkMessage now has a type(): MessageType<?> function. This
is used by the Fabric networking code (and for NeoForge's on 1.20.4)
instead of a class lookup.
- NetworkMessages now creates/stores these MessageType<T>s (much like
we'd do for registries), and provides getters for the
clientbound/serverbound messages. Mod initialisers then call these
getters to register packets.
- For Forge, this is relatively unchanged. For Fabric, we now use
`FabricPacket`s. (A rough sketch of the new interfaces follows.)
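Approximately (names and signatures here are not the exact implementation):

```java
// Approximate shapes only - the real definitions differ in detail.
import net.minecraft.network.FriendlyByteBuf;

// Implemented by the loader-specific modules; holds whatever that loader
// needs to register a packet (a discriminator and reader function on
// Forge, a FabricPacket PacketType on Fabric, ...).
interface MessageType<T extends NetworkMessage<?>> {
}

interface NetworkMessage<T> {
    // Write this message to the wire.
    void write(FriendlyByteBuf buf);

    // Used by the Fabric networking code (and NeoForge on 1.20.4) instead
    // of a class lookup. NetworkMessages creates and stores these types,
    // much like registry entries, and exposes getters that the mod
    // initialisers use to register each packet.
    MessageType<? extends NetworkMessage<T>> type();
}
```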
While ComputerFamily is still useful, there's definitely some places
where it adds an extra layer of indirection. This commit attempts to
clean up some places where we no longer need it.
- Remove ComputerFamily from AbstractComputerBlock. The only place this
was needed was in TurtleBlock, and that can be replaced with normal
Minecraft explosion resistance!
- Pass in the fuel limit to the turtle block entity, rather than
deriving it from current family.
- The turtle BERs now derive their model from the turtle's item, rather
than the turtle's family.
- When creating upgrade/overlay recipes, use the item's name rather
than {pocket,turtle}_family. This means we can drop getFamily() from
IComputerItem (it is still needed to handle the UI).
- We replace IComputerItem.withFamily with a method to change to a
different item of the same type. ComputerUpgradeRecipe no longer
takes a family, and instead just uses the result's item.
- Computer blocks now use the normal Block.asItem() to find their
corresponding item, rather than looking it up via family.
The above means we can remove all the family-based XyzItem.create(...)
methods, which have always felt a little ugly.
We still need ComputerFamily for a couple of things:
- Permission checks for command computers.
- Checks for mouse/colour support in ServerComputer.
- UI textures.
- Add a check to ensure the dependencies declared in the :core project
and those inherited from Minecraft are the same.
- Compute the next Cobalt version, rather than specifying it manually.
- Add the gradle versions plugin (and version catalog update), and
update some versions.
Previously we prevented our published full jar from depending on any of
the other projects by excluding the whole cc.tweaked group. However, as
Cobalt also now lives in that group, this meant we were missing the
Cobalt dependency.
Rather than specifying a wildcard, we now exclude the dependencies when
adding them to the project.
This commit adds abstract classes to describe the interface for our
mod-loader-specific generic peripherals (inventories, fluid storage,
item storage).
This offers several advantages:
- Javadoc to illuaminate conversion no longer needs the Forge project
(just core and common).
- Ensures we have a consistent interface between Forge and Fabric.
Note, this does /not/ implement fluid or energy storage for Fabric. We
probably could do fluid without issue, but it's not something worth
doing right now.
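As an illustration, such an abstract class might look roughly like this; the class name and method set are assumptions about the shape (GenericSource and LuaFunction are the existing API):

```java
// Illustrative shape only: loader-specific modules subclass this and
// implement the methods against their own inventory abstraction (T).
import java.util.Map;

import javax.annotation.Nullable;

import dan200.computercraft.api.lua.GenericSource;
import dan200.computercraft.api.lua.LuaFunction;

public abstract class AbstractInventoryMethods<T> implements GenericSource {
    @Override
    public final String id() {
        return "computercraft:inventory";
    }

    /** The number of slots in this inventory. */
    @LuaFunction(mainThread = true)
    public abstract int size(T inventory);

    /** List the (basic) contents of this inventory. */
    @LuaFunction(mainThread = true)
    public abstract Map<Integer, Map<String, ?>> list(T inventory);

    /** Get detailed information about the item in a particular slot. */
    @LuaFunction(mainThread = true)
    @Nullable
    public abstract Map<String, ?> getItemDetail(T inventory, int slot);
}
```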
This adds a new "java_allocation" metric, which tracks the number of
bytes allocated while executing the computer (as measured by Java). This
is not a 100% reliable number, but hopefully gives some insight into
what computers are doing.
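The commit doesn't say how the measurement works, but one plausible mechanism (an assumption, not necessarily what the code does) is HotSpot's per-thread allocation counter:

```java
// Sample com.sun.management.ThreadMXBean's per-thread allocation counter
// before and after running a task on the worker thread.
import java.lang.management.ManagementFactory;

import com.sun.management.ThreadMXBean;

final class AllocationTracker {
    private static final ThreadMXBean THREADS =
        (ThreadMXBean) ManagementFactory.getThreadMXBean();

    static long measure(Runnable task) {
        var thread = Thread.currentThread().getId();
        var before = THREADS.getThreadAllocatedBytes(thread);
        task.run();
        var after = THREADS.getThreadAllocatedBytes(thread);
        // Not 100% reliable: TLAB granularity and JIT activity add noise,
        // but good enough for a rough insight into computer behaviour.
        return after - before;
    }
}
```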
In practice, we're never going to change this to true by default. The
old Tekkit Legends pack enabled this[^1], and that caused a lot of
problems, though admittedly back in 2016 so things might be better now.
If people do want this functionality, it should be fairly easy to
replicate with a datapack, adding a file to rom/autorun.
[^1]: See https://www.computercraft.info/forums2/index.php?/topic/27663-
Hate that I remember this, why is this still in my brain?
Allows registering arbitrary block lookup functions instead of a
platform-specific capability. This is roughly what Fabric did before,
but generalised to also take an invalidation callback.
This callback is a little nasty - it needs to be a NonNullConsumer on
Forge, but that class isn't available on Fabric. For now, we make the
lookup function (and thus the generic peripheral provider) generic on
some <T extends Runnable> type, then specialise that on the Forge side.
Hopefully we can clean this up when NeoForge reworks capabilities.
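The rough shape of such a lookup function (the interface name and parameters are illustrative):

```java
// Illustrative: a lookup function registered in place of a capability.
import javax.annotation.Nullable;

import net.minecraft.core.BlockPos;
import net.minecraft.core.Direction;
import net.minecraft.server.level.ServerLevel;

@FunctionalInterface
interface ComponentLookup<C, T extends Runnable> {
    /**
     * Find some component (an inventory, fluid storage, etc.) in the world.
     *
     * @param level      The current level.
     * @param pos        The position of the block to query.
     * @param side       The side of the block to query.
     * @param invalidate A callback to run when the returned value becomes
     *                   stale and should be looked up again. Specialised to
     *                   Forge's consumer type on that loader.
     * @return The component, or {@code null} if not present.
     */
    @Nullable
    C find(ServerLevel level, BlockPos pos, Direction side, T invalidate);
}
```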
Previously we had the invariant that if we had a server monitor, we also
had a terminal. When a monitor shrank, we deleted the monitor, and then
recreated it when a peripheral was requested.
As of ab785a0906 this has changed
slightly, and we now just delete the terminal (keeping the ServerMonitor
around). However, we didn't adjust the peripheral code accordingly,
meaning we didn't recreate the /terminal/ when a peripheral was
requested.
The fix for this is very simple - most of the rest of this commit is
some additional code for ensuring monitor invariants hold, so we can
write tests with a little more confidence.
I'm not 100% sold on this approach. It's tricky having a double layer of
nullable state (ServerMonitor, and then the terminal). However, I think
this is reasonable - the ServerMonitor is a reference to the multiblock,
and the Terminal is part of the multiblock's state.
Even after all the refactors, monitor code is still nastier than I'd
like :/.
Fixes #1608
We can't use FriendlyByteBuf.readCollection to read into a
pre-allocated/array-backed NonNullList, as that doesn't implement
List.add. Instead, we just need to do a normal loop.
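A minimal sketch of the fix, assuming we're reading a fixed-size list of ingredients (the surrounding recipe code will differ):

```java
import net.minecraft.core.NonNullList;
import net.minecraft.network.FriendlyByteBuf;
import net.minecraft.world.item.crafting.Ingredient;

final class RecipeNetworkHelpers {
    /** Read a fixed-size ingredient list, using set() rather than add(). */
    static NonNullList<Ingredient> readIngredients(FriendlyByteBuf buf) {
        var size = buf.readVarInt();
        var ingredients = NonNullList.withSize(size, Ingredient.EMPTY);
        for (var i = 0; i < size; i++) ingredients.set(i, Ingredient.fromNetwork(buf));
        return ingredients;
    }
}
```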
We add a couple of tests to round-trip our recipe specs. Unfortunately
we can't test the recipes themselves as our own registries aren't set
up, so this'll have to do for now.
This attempts to reduce some duplication in recipe serialisation (and
deserialisation) by moving the structure of a recipe (group, category,
ingredients, result) into separate types.
- Add ShapedRecipeSpec and ShapelessRecipeSpec, which store the core
properties of shaped and shapeless recipes (sketched below). There are
a couple of additional classes here for handling some of the other
shared or complex logic.
- These classes are now used by two new Custom{Shaped,Shapeless}Recipe
classes, which are (mostly) equivalent to Minecraft's
shaped/shapeless recipes, just with support for nbt in results.
- All the other similar recipes now inherit from these base classes,
which allows us to reuse a lot of this serialisation code. Alas, the
total code size has still gone up - maybe there's too much
abstraction here :).
- Mostly unrelated, but fix the skull recipes using the wrong UUID
format.
This allows us to remove our mixin for nbt in recipes (as we just use
our custom recipe now) and simplify serialisation a bit - hopefully
making the switch to codecs a little easier.
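For illustration, the shaped variant might look something like the record below; the field names are guesses at the shape rather than the actual definition:

```java
import net.minecraft.core.NonNullList;
import net.minecraft.world.item.ItemStack;
import net.minecraft.world.item.crafting.CraftingBookCategory;
import net.minecraft.world.item.crafting.Ingredient;

// Guessed shape: the shared structure of a shaped recipe, reused by the
// custom recipe classes and their (de)serialisers.
public record ShapedRecipeSpec(
    String group,
    CraftingBookCategory category,
    int width,
    int height,
    NonNullList<Ingredient> ingredients,
    ItemStack result  // may carry NBT, unlike vanilla's shaped serialiser
) {
    // toJson/fromJson and toNetwork/fromNetwork helpers would live
    // alongside this, so every shaped recipe shares one serialisation path.
}
```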
Rather than having a mess of lambdas, we now move the bulk of the
implementation into their own methods. The lambdas now just do argument
extraction - it's all stringly typed, so it's good to keep that with the
argument definition.
This also removes a couple of exception keys (and thus their translation
keys) as we no longer use them.
I removed this in aa0d544bba, way back in
late 2021. Looks like it's been updating in the meantime and I hadn't
noticed, so add it back.
I've simplified the code a little bit, to make use of our new capability
helpers, but otherwise it's almost exactly the same :D.
- Split buttons.png into individual textures.
- Split corners_xyz.png into the following:
- borders_xyz.png: A nine-sliced texture of the computer borders.
- pocket_bottom_xyz.png: A horizontally 3-sliced texture of the
bottom part of a pocket computer.
- sidebar_xyz.png: A vertically 3-sliced texture of the computer
sidebar.
While it may seem a little odd not to split the sliced textures into
smaller ones, it's consistent with what vanilla does in 1.20.2, and I
think it will make editing them easier than juggling 9 textures.
I do want to make this more data-driven in the future, but that will
have to wait until the changes in 1.20.2.
This also adds a tools/update-resources.py program, which performs this
transformation on a given resource pack.
- Add a generic PermissionRegistry interface. This behaves similarly to
our ShaderMod interface, searching all providers until it finds a
compatible one.
We could just make this part of the platform code instead, but this
allows us to support multiple systems on Fabric, where things are
less standardised.
This interface behaves like a registry, rather than a straight
`getPermission(node, player)` method, as Forge requires us to list
our nodes up-front (a rough sketch follows this list).
- Add Forge (using the built-in system) and Fabric (using
fabric-permissions-api) implementations of the above interface.
- Register permission nodes for our commands, and use those
instead. This does mean that the permissions check for the root
/computercraft command now requires enumerating all child
commands (and so potentially does 7 permission lookups), but hopefully
this isn't too bad in practice.
- Remove UserLevel.OWNER - we never used this anywhere, and I can't
imagine we'll want to in the future.
- Remove some unused translation keys.
- Run tools/language.py to sort the current translations and remove the
aforementioned unused keys.
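The promised sketch of the PermissionRegistry shape (method names are approximate, not the actual interface):

```java
// Approximate shape only; the real interface differs in detail.
import java.util.function.Predicate;

import net.minecraft.commands.CommandSourceStack;

import dan200.computercraft.shared.command.UserLevel; // CC's existing permission-level enum.

public interface PermissionRegistry {
    /**
     * Register a permission node for one of our commands, returning the
     * predicate the command should use as its requirement. Nodes must all
     * be registered up-front, as Forge's permission API requires it.
     *
     * @param command  The name of the command (and thus the node).
     * @param fallback The level to fall back to when no permission
     *                 provider handles this node.
     */
    Predicate<CommandSourceStack> registerCommand(String command, UserLevel fallback);

    /** Called once all nodes have been added. */
    void register();
}
```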
- Update turtle tool impostor recipes - these now include the tool NBT!
- Overhaul model loading to work with the new API. This allows for
using the emissive texture system in a more generic way, which is
nice!
- Convert some of our custom models to use Fabric's model hooks (i.e.
emitItemQuads). We don't make use of this right now, but might be
useful for rendering tools with enchantment glints.
Note this does /not/ change any of the turtle block entity rendering
code to use Fabric/Forge's model code. This will be a change we want
to make in the future.
- Some cleanup of our config API. This fixes us printing lots of
warnings when creating a new config file on Fabric (same bug also
occurs on Forge, but that's a loader problem).
- Fix a few warnings
We've supported resource conditions in the upgrade JSON for an age, but
didn't expose them in our data generators at all.
Indeed, using these hooks is a bit of a pain to do in multi-loader
setups, as the JSON is different between the two loaders. We could
generate the JSON for all loaders at once, but it feels nicer to use
the per-loader APIs to add the conditions.
For now, we just support generating a single condition - whether a mod
is loaded or not, via the requireMod(...) method.
We switched to Forge's loot modifier system in the 1.20 update, as
LootTable.addPool had been removed. Turns out this was by accident, and
so we switch back to the previous implementation, as it's much simpler
and more efficient.
Turtle tools now accept two additional JSON fields:
- allowEnchantments: Whether items with enchantments (or any
non-standard NBT) can be equipped.
- consumesDurability: Whether durability will be consumed. This can be
"never" (the current and default behaviour), "always", and
"when_enchanted".
Closes #1501.
This removes a tiny bit of duplication (at the cost of more code), but
makes the interface more intuitive, as there's no bouncing between
getCombination -> cache -> buildModel.
It turns out we don't document the "port" option anywhere, so probably
worth doing a bit of an overhaul here.
- Expand the top-level HTTP rules comment, clarifying how things are
matched and describing each field.
- Improve the comments on the default HTTP rule. We now also describe
the $private rule and its motivation.
- Don't drop/ignore invalid rules. This gets written back to the
original config file, so is very annoying! Instead we now log an
error and convert the rule into a "deny all" rule, which should make
it obvious something is wrong.
- Update to Loom 1.2 and FG 6.0. ForgeGradle has changed how it
generates the runXyz tasks, which makes running our tests much
harder. I've raised an issue upstream, but for now we do some nasty
poking of internals.
- Fix Sodium/Iris tests. Loom 1.1 changed how remapped configurations
are generated - we create a dummy source set and associate the
remapped configuration with that. All nasty stuff.
- Publish the common library. I'm not a fan of this, but given how many
internals I'm poking elsewhere, I should probably get off my high
horse.
- Add renderdoc support to the client gametests, enabled with
-Prenderdoc.
- Move several interfaces out of `dan200.computercraft.core.asm` into a
new `methods` package. It may make sense to expose this to the
public API in a future commit (possibly part of #1462).
- Add a new MethodSupplier<T> interface, which provides methods to
iterate over all methods exported by an object (either directly, or
including those from ObjectSources). A rough sketch of its shape
appears at the end of this list.
This interface's concrete implementation (asm.MethodSupplierImpl) uses
Generators and IntCaches as before - we can now make all of that
package-private though, which is nice!
- Make the LuaMethod and PeripheralMethod MethodSupplier local to the
ComputerContext. This currently has no effect (the underlying
Generator is still global), but eventually we'll make GenericMethods
non-global, which unlocks the door for #1382.
- Update everything to use this new interface. This is mostly pretty
sensible, but is a little uglier on the MC side (especially in
generic peripherals), as we need to access the global ServerContext.
- Remove SidedGenericPeripheral (we never used this!), adding the
functionality to GenericPeripheral directly. This is just used on the
Fabric side for now, but might make sense with Forge too.
- Move GenericPeripheralBuilder into the common project - this is
identical between the two projects!
- GenericPeripheralBuilder now generates a list of methods internally,
rather than being passed the methods.
- Add a tiny bit of documentation.
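The promised sketch of MethodSupplier<T> (method names and callback types are approximate):

```java
// Approximate shape of the interface; the exact method names and
// callback types differ in the real code.
import java.util.function.BiConsumer;

public interface MethodSupplier<T> {
    /**
     * Iterate over every method exported by an object, both those declared
     * directly on it and those contributed via ObjectSources.
     *
     * @param object   The object to introspect.
     * @param consumer Invoked with each method's name and implementation.
     * @return Whether any methods were found at all.
     */
    boolean forEachMethod(Object object, BiConsumer<String, T> consumer);
}
```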