The Base64 code is wrong (for an ASCII input)! For a Unicode input the output is ok.
The Base64 encoder encodes a memory buffer; it doesn't know whether that buffer holds ASCII or Unicode data. The memory buffer in your example is a PureBasic string, and PureBasic strings are ASCII when compiled in ascii mode and Unicode when compiled in unicode mode. So what happens is only natural.
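As a quick illustration (nothing beyond StringByteLength() and SizeOf(Character) is assumed here), the same string occupies a different number of bytes depending on the compiler mode, and those raw bytes are exactly what the encoder sees:

Code: Select all
Text.s = "Hello"
; The encoder only works on the raw bytes of the string buffer,
; and those bytes differ between ascii and unicode mode.
Debug SizeOf(Character)      ; 1 in ascii mode, 2 in unicode mode
Debug StringByteLength(Text) ; 5 in ascii mode, 10 in unicode mode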
Your example isn't the right way to do it. You poke ASCII data into a Unicode string, which will give you problems at some point. In fact, you already have one: StringByteLength(Work) doesn't return the correct length of ASCII data stored inside a Unicode string, as this example shows:
Code: Select all
Work.s = Space(1024)                 ; reserve a 1024 character string buffer
PokeS(@Work, "12345", -1, #PB_Ascii) ; poke ascii data directly into the (unicode) string
Debug StringByteLength(Work)         ; wrong in unicode mode
When you have string data that could be in a different format than normal PB strings, you should use allocated memory instead, like this:
Code: Select all
Example.s = "This is a test string!"
*Work = AllocateMemory(1024)    ; buffer for the ascii version of the string
*Encoded = AllocateMemory(1024) ; buffer for the Base64 output
PokeS(*Work, Example, Len(Example), #PB_Ascii)
Debug Base64Encoder(*Work, MemoryStringLength(*Work, #PB_Ascii), *Encoded, MemorySize(*Encoded))
Debug PeekS(*Encoded, -1, #PB_Ascii)
FreeMemory(*Work)    ; important
FreeMemory(*Encoded) ; important
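If you want to verify the result, you can run the round trip and decode it again. This is only a sketch: it assumes Base64Decoder() takes the same four buffer/length arguments as the Base64Encoder() call above.

Code: Select all
Example.s = "This is a test string!"
*Work    = AllocateMemory(1024)
*Encoded = AllocateMemory(1024)
*Decoded = AllocateMemory(1024)
PokeS(*Work, Example, Len(Example), #PB_Ascii)
; Assumption: Base64Decoder(*Input, InputLength, *Output, OutputLength) mirrors the encoder call
EncodedLength = Base64Encoder(*Work, MemoryStringLength(*Work, #PB_Ascii), *Encoded, MemorySize(*Encoded))
Base64Decoder(*Encoded, EncodedLength, *Decoded, MemorySize(*Decoded))
Debug PeekS(*Decoded, -1, #PB_Ascii) ; should print the original string again
FreeMemory(*Work)
FreeMemory(*Encoded)
FreeMemory(*Decoded)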