What I have now works, but it is very slow - I am wondering if there is a more modern solution, please?
I have binary file data stored in a table (BLOB field) along with the original file name and other fields derived from the file name. I can manually import data using UploadIntoStream; however, I need to automate this process, calling a web service to load the data.
I have a codeunit which is passed the File Name (Text) and File Content (BigText) - this is called from a VB.NET application that uses a SOAP envelope and does a Convert.ToBase64String on the file content.
The codeunit needs to convert the Base64 back to binary to store in the BLOB field (the same way UploadIntoStream would store the data).
Why do you want to store all those binary files in your database?
I think it is better to save the binary files outside your database.
In the System Application there is a module for Azure blob storage.
If you save the files to Azure blob storage, you can retrieve them via that module inside BC.
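The upload side can be sketched with the Azure Blob Services API module from the System Application. This is only a sketch: the account, container, and key are placeholders, and the exact codeunit and method names should be verified against your BC version.

```al
procedure UploadToAzure(FileName: Text; InStr: InStream)
var
    ABSBlobClient: Codeunit "ABS Blob Client";
    StorageServiceAuthorization: Codeunit "Storage Service Authorization";
    Authorization: Interface "Storage Service Authorization";
begin
    // Shared-key authorization; '<account-key>' is a placeholder for your storage key.
    Authorization := StorageServiceAuthorization.CreateSharedKey('<account-key>');
    ABSBlobClient.Initialize('<storage-account>', '<container>', Authorization);
    // Upload the file content as a block blob named after the original file.
    ABSBlobClient.PutBlobBlockBlobStream(FileName, InStr);
end;
```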
This way your database stays small and performs very well.
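If you do want to keep the files in the BLOB field, the decoding can be done in one step with the Base64 Convert codeunit from the System Application. A minimal sketch, assuming a table "Document Attachment Store" with fields "File Name" (Text) and "File Content" (BLOB) - the table and field names are illustrative; if the caller hands you a BigText, read it into a Text variable first:

```al
procedure ImportFile(FileName: Text; Base64Content: Text)
var
    FileRec: Record "Document Attachment Store"; // illustrative table
    Base64Convert: Codeunit "Base64 Convert"; // System Application
    OutStr: OutStream;
begin
    FileRec.Init();
    FileRec."File Name" := CopyStr(FileName, 1, MaxStrLen(FileRec."File Name"));
    // Open an OutStream on the BLOB field and decode straight into it,
    // avoiding a slow byte-by-byte conversion loop.
    FileRec."File Content".CreateOutStream(OutStr);
    Base64Convert.FromBase64(Base64Content, OutStr);
    FileRec.Insert(true);
end;
```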
This code first creates an OutStream for the BLOB field, then uses the Base64 Convert codeunit to decode the Base64 text directly into binary data written to that OutStream.
Using this method, you should see an improvement in the runtime of your codeunit.