Working Projects

From PCGen Wiki
Revision as of 19:16, 10 November 2018 by Tom Parker (talk | contribs) (What is scope of work?)

Token Rebuilds to Interface

What is the situation today?

Our loading tokens use a commit process (evaluate, then commit or roll back), much like a SQL database transaction.

Today this process uses classes like AbstractObjectContext (which contains an ObjectCommitStrategy) to perform this commit process.
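As a rough illustration, the commit pattern the context classes implement can be sketched like this (CommitBuffer and its methods are hypothetical stand-ins invented here; the real AbstractObjectContext and ObjectCommitStrategy are far richer):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of an evaluate-then-commit strategy: edits are
// buffered and only applied to the real object on commit.
final class CommitBuffer {
    private final Map<String, Object> pending = new HashMap<>();

    // Record a change without touching the target object.
    void put(String key, Object value) {
        pending.put(key, value);
    }

    // Apply all buffered changes to the target, then clear the buffer.
    void commit(Map<String, Object> target) {
        target.putAll(pending);
        pending.clear();
    }

    // Discard all buffered changes (e.g. on a token parse failure).
    void rollback() {
        pending.clear();
    }
}
```

A token parse that fails simply triggers rollback(), so the partial data never touches the target object.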

Why do we do this?

  1. Primarily, if a token has bad syntax, we don't want a partial data structure to get into the core code. A partial structure causes a tremendous number of "what if" scenarios in our code, which leads folks to (a) write a ton of error-checking code and (b) catch and silently consume errors
  2. We want to catch LST errors at load and report that ASAP to the data team. If they have to actually USE a token in order to make sure it is valid, that is a HUGE testing burden on the data team.

This commit process has been VERY valuable (we want to keep it!), but also challenging. There are a few limitations we want to get rid of:

  1. This forces us to only do things that ObjectCommitStrategy knows about. There is a huge centralized code dependency (high workload, fragile)
  2. This forces us to remember never to write to a specific object, but always through the Context (annoying, fragile)
  3. This forces a design where "everything is a CDOMObject", which creates a form of demi-god object (PlayerCharacter is still our "God Object" ;)

Where do I see this stuff?

  1. The tokens are in plugin.lsttokens
  2. The contexts are in pcgen.rules.context
  3. Some abstract items are in pcgen.rules.persistence.token

An actual example of how the token loading works is in pcgen.rules.persistence.TokenSupport.processClassTokens(...); it returns success or failure and, on failure, captures any messages into a set of parse messages.

The actual commit/rollback process can be seen in pcgen.persistence.lst.LstUtils.processToken(...)
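The control flow around a single token can be sketched as follows (ProcessSketch and its trivial "parse" rule are invented for illustration; the real signatures in TokenSupport and LstUtils differ):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch of the flow in LstUtils.processToken: stage the
// parsed value, commit it on success, discard it (rollback) on failure,
// and collect parse messages so errors are reported at load time.
final class ProcessSketch {
    final Map<String, String> committed = new HashMap<>();
    final List<String> messages = new ArrayList<>();

    boolean processToken(String key, String value) {
        Map<String, String> staged = new HashMap<>();
        // Trivial stand-in for real token syntax checking.
        if (value == null || value.isEmpty()) {
            messages.add("Invalid value for token " + key);
            return false; // rollback: staged changes never reach 'committed'
        }
        staged.put(key, value);
        committed.putAll(staged); // commit the staged changes
        return true;
    }
}
```

This is what gives the data team immediate load-time error reporting: a bad value produces a message and leaves the committed data untouched.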

What is the solution?

The best way to get around this is to "properly" use reflection, which was not something many developers were comfortable with around 2006, when we started down the token path. Thankfully, this can be abstracted into a library, so the reflection is well hidden from most users/developers.

The reflection code is StagingInfo, and it exists in our Base PCGen library (the pcgen-base repository) in pcgen.base.proxy.

Is there an example in the code today?

Yes. This staging is used for a limited number of tokens, and you can find the process in pcgen.rules.persistence.TokenSupport.processInterfaceToken(...)
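To give a flavor of the idea, here is a minimal proxy-based staging sketch in the spirit of StagingInfo (all names here — Staging, WriteAlign, DeityLike — are hypothetical; see pcgen.base.proxy for the real implementation):

```java
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch in the spirit of pcgen.base.proxy.StagingInfo:
// calls made on the staged proxy are recorded rather than applied, and
// are replayed onto the real object only when applyTo(...) is called.
final class Staging<T> {
    private final List<Object[]> recorded = new ArrayList<>();
    private final T proxy;

    @SuppressWarnings("unchecked")
    Staging(Class<T> writeInterface) {
        proxy = (T) Proxy.newProxyInstance(
            writeInterface.getClassLoader(),
            new Class<?>[] { writeInterface },
            (Object p, Method m, Object[] args) -> {
                recorded.add(new Object[] { m, args });
                return null; // write methods are assumed to return void
            });
    }

    T getWriteProxy() {
        return proxy;
    }

    // Commit: replay every recorded call onto the real target object.
    void applyTo(T target) throws Exception {
        for (Object[] call : recorded) {
            ((Method) call[0]).invoke(target, (Object[]) call[1]);
        }
    }
}

// Hypothetical write interface and target, for illustration only.
interface WriteAlign {
    void setAlignment(String align);
}

class DeityLike implements WriteAlign {
    String alignment;
    public void setAlignment(String align) { this.alignment = align; }
}
```

A loader can hand a token the write proxy during parsing and call applyTo(...) only if parsing succeeds, preserving the commit/rollback guarantee without a centralized ObjectCommitStrategy.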

What is the scope of work?

At this point, some folks may be thinking things like "Hey - I can have a getAlignment and putAlignment method on Deity.class!". That may be literally true, but there are a few considerations to that:

  1. One step at a time
  2. There are things outside the scope of this description that are also changing. They make the incremental first step described here valuable; a dramatic step that completely eliminated the abstraction in CDOMObject could undo other things we value

The scope of work here is to change indirect calls through an object context, e.g.:

  context.getObjectContext().put(deity, ObjectKey.ALIGNMENT, al);

...into direct calls on the object, e.g.:

  deity.put(ObjectKey.ALIGNMENT, al);

This directly meets requirements #1 and #2 above.
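The conversion can be sketched as a before/after pair (heavily simplified, hypothetical signatures; real tokens parse a string value, and the real Deity class differs):

```java
import java.util.Map;

// Hypothetical, heavily simplified before/after for a token like ALIGN.
final class AlignConversionSketch {
    // Stand-in for the real Deity class.
    static final class Deity {
        String alignment;
        void putAlignment(String align) { this.alignment = align; }
    }

    // Before: the token writes indirectly, into a context-managed buffer
    // that is committed later by the commit strategy.
    static void parseViaContext(Map<Deity, String> contextBuffer, Deity deity, String value) {
        contextBuffer.put(deity, value);
    }

    // After: the token writes directly to the object; staging is handled
    // by a proxy (in the spirit of StagingInfo) instead of a central
    // commit strategy.
    static void parseDirect(Deity deity, String value) {
        deity.putAlignment(value);
    }
}
```

The direct form removes the token's dependency on the central context while the proxy layer keeps the rollback guarantee.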

In order to do this a few things need to happen:

  • Read and write interfaces need to be defined, so they can be passed to a StagingInfo

It is my belief that the interface would work best right now if the scope was a single behavior. So the put and get just for ObjectKey.* would be processed separately.

  • The tokens that currently use those ObjectContext methods need to be converted to call the methods directly on the object
  • A method needs to be defined that will appropriately limit usage of the object
    This may require a bit of explanation. Let's use plugin.lsttokens.deity.AlignToken. Today, this keys directly off the object, Deity, using CDOMPrimaryToken as the interface (the parent interface CDOMToken actually defines how TokenSupport will process that token).
    If we create interfaces ReadObjectKey and WriteObjectKey, those can be reused across all of the same types of methods, but it places us in a situation where we have lost the ability to enforce ALIGN:x only works on a Deity. So there is a balancing act here:
    We probably don't want to have interfaces directly for each type of object (readDeity and writeDeity), as that creates interface proliferation (and we want to get away from that level of specific behavior anyway)
    We probably want to think about how we can have tokens actually work on Dynamic objects - this relates to (A) above... the dynamic objects are all the same class, and thus a different validation method is necessary than one based on class or the interfaces implemented by that class. (see pcgen.cdom.inst.Dynamic)
  • It is likely that a new interface beyond CDOMPrimaryToken / CDOMInterfaceToken needs to be defined. It is possible CDOMInterfaceToken could be enhanced with a default method that always passes, but could be overridden with a method that would check for legality from #3 above in cases where it is necessary.
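One possible shape for these pieces, sketched with hypothetical names (ReadObjectKey/WriteObjectKey follow the names suggested above; InterfaceTokenSketch, allowedOn, and DeityMarker are invented for illustration):

```java
// Hypothetical single-behavior interfaces: each pair covers exactly one
// kind of storage, so staging can handle that behavior in isolation.
interface ReadObjectKey {
    Object get(String key);
}

interface WriteObjectKey {
    void put(String key, Object value);
}

// Hypothetical token interface: the default legality check always
// passes, but a token can override it to restrict where it is legal.
interface InterfaceTokenSketch {
    String getTokenName();

    default boolean allowedOn(Object target) {
        return true;
    }
}

// Stand-in marker for Deity.
interface DeityMarker {}

final class AlignTokenSketch implements InterfaceTokenSketch {
    public String getTokenName() {
        return "ALIGN";
    }

    @Override
    public boolean allowedOn(Object target) {
        // Enforce that ALIGN:x only works on a Deity-like object.
        return target instanceof DeityMarker;
    }
}
```

The default method means existing interface tokens would need no change, while tokens like ALIGN can opt into a stricter check. Note that an instanceof-based check would not help for Dynamic objects, since they all share one class; a different validation method is needed there.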

What is one possible path?

WARNING: Any mixed tokens that still use the context probably should not be converted, so two of these actions will be reverted at the end

  1. Take the import for ObjectKey out of ObjectCommitStrategy and find the errors
  2. Delete all the methods that are now errors as a result of being removed from the interface
  3. A whole bunch of tokens will now have errors. Those need to be checked. If a token is just a simple use of the interface, then:
    1. Change it to make direct calls rather than going through the ObjectContext
    2. Convert it to the new load method (not CDOMPrimaryToken)
    3. Add the limit from #3 above (limiting where the token is legal)

If they are complicated (FavoredClass), then I'd leave as-is for the moment.

  4. Revert the changes made in #1 and #2

Lather, rinse, repeat for ObjectKey, IntegerKey, ListKey, StringKey, FactKey, FactSetKey

You can skip VariableKey, FormulaKey, MapKey for now.