I've run into a small execution problem with the Point command, and I couldn't find a return-type convention in the documentation.
In the doc, we have this:

Syntax :
Result = Point(X, Y)

Which, with explicit types, means:
Result.I = Point(X.I, Y.I)

On x86 :
Result.L = Point(X.L, Y.L)

On x64 :
Result.Q = Point(X.Q, Y.Q)
So here is the problem I ran into: on x64, I tried to store a color value (with alpha) from the Point() result into an array of 32-bit Longs, which are always signed.
So, let's look at the process at the binary level:

Point(X, Y) hex result :
$00000000FFAABBCC

Value stored in the Longs array :
$FFAABBCC

Comparison between another pixel with the same color and the value in the Longs array :
False (!)
Analysis of this comparison: when the Long is read back for a 64-bit comparison, it is sign-extended (its top bit is set, so the value is negative). The question actually being asked is:

Is
$00000000FFAABBCC
equal to
$FFFFFFFFFFAABBCC
?
Since I couldn't find the result type in the doc, after making my beautiful little type bug I was wondering which change to choose so that my code stays compatible going from x64 back to x86.
My choice on x64 is to change my array type from Long to Integer, but I am not sure it's the right one, since I see no information or rule about the default return type in the doc...