Hi,
I'm one of the developers still working on The Realm, which as you know uses the SCI engine. The PMachine implementation we have was adapted from the 16-bit interpreter to work on a 32-bit machine. All of the SCI values are still 16-bit, but the interpreter itself is written in 32-bit x86 asm.
I recently reimplemented the PMachine in C++ and removed all of the inline asm from our codebase. Further, I widened the SCI values to 64-bit, including the memory manager handles. We often hit the 64K handle limit, so this lifts that limitation for us.
In doing so, I believe I discovered a bug in the way function parameters and temporaries are laid out on the SCI stack. I was wondering whether this was ever noticed in Sierra's implementation, and whether it caused any bugs in real Sierra games.
When a method is called, argc is pushed onto the stack, followed by the parameters. On entry, if the function uses any temporaries, a 'link' opcode at the start of the function reserves space for however many temporaries it declares. There is never any validation of the number of arguments passed to a function. This is a 'feature': all functions are effectively 'vararg' functions, and the &rest mechanism can be used to forward arguments along. Passing fewer arguments than a function expects is also sometimes intentional - scripts can check 'argc', notice the missing arguments, and behave differently.

However, you're also allowed to -assign- to parameters. What I believe happens is this: if a function declares 4 parameters and 2 temporaries, but the caller only passes 2 arguments, the stack slots for the two non-passed parameters and the two temporaries end up at the same locations. Assigning to one of those parameters changes the value of the corresponding temporary, and vice versa.
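To make the aliasing concrete, here is a minimal sketch of the frame layout, roughly how I model it in my C++ rewrite. The names (PMachineStack, parm, temp, link, and so on) are mine, not Sierra's, and the details are simplified, but the slot arithmetic matches the behavior described above: parm slots are addressed from the frame base with no check against argc, while temps start at the stack top left after the pushed arguments.

#include <cstdint>
#include <initializer_list>
#include <iostream>
#include <vector>

// Illustrative names only; widened 64-bit values as in my rewrite.
using SciValue = std::int64_t;

struct PMachineStack {
    std::vector<SciValue> mem = std::vector<SciValue>(64, 0);
    size_t sp = 0;                        // next free slot

    void push(SciValue v) { mem[sp++] = v; }

    // Caller side: push argc, then the arguments.
    size_t call(std::initializer_list<SciValue> args) {
        size_t frame = sp;                // frame base points at argc
        push(static_cast<SciValue>(args.size()));
        for (SciValue a : args) push(a);
        return frame;
    }

    // Callee side: 'link n' reserves n temporaries at the current top.
    size_t link(size_t nTemps) {
        size_t tempBase = sp;
        sp += nTemps;                     // no check against argc
        return tempBase;
    }

    // parm i is addressed relative to the frame base (parm 0 == argc),
    // with no bounds check against argc -- the source of the aliasing.
    SciValue& parm(size_t frame, size_t i) { return mem[frame + i]; }
    SciValue& temp(size_t tempBase, size_t i) { return mem[tempBase + i]; }
};

int main() {
    PMachineStack s;

    // Method declared with 4 parameters and 2 temporaries,
    // but the caller only pushes 2 arguments.
    size_t frame = s.call({10, 20});
    size_t temps = s.link(2);

    s.temp(temps, 0) = 123;                 // write temp 0
    std::cout << s.parm(frame, 3) << "\n";  // prints 123: parm 3 aliases temp 0

    s.parm(frame, 4) = 456;                 // assigning to parm 4 ...
    std::cout << s.temp(temps, 1) << "\n";  // ... prints 456: it clobbered temp 1
}

With only 2 arguments pushed, the temp base sits immediately after them, so parm 3 and parm 4 land on the same slots as temp 0 and temp 1.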
Just something that came up while I was debugging my rewrite; I thought you all would find it interesting.