How to use D365 F&O with blob storage

Hi All,

I hope I'm in the right place here; if not, please let me know :wink:

We would like to know something about the functionality of Dynamics 365 F&O and the Blob Storage behind it.

Currently, we move some specific invoices to blob storage and copy these PDF files to our local file server via a batch job and a PowerShell script. In principle this works quite well; the problem is that the key used to make this possible expires after 30 days. Does anybody know how we can extend this period, or whether it is possible to move those files directly to our file server without using blob storage?

Thanks in advance for your help.



I see that this question is already being discussed in the Dynamics Community forum, and Sascha shared more information there than here.

Yes Sir, that's right. I hope this is okay? I didn't know that both platforms are connected.

They aren’t connected, but ignoring the previous discussion and starting again from scratch would be a waste of your time, and also of the time of anybody helping you. Sharing all the information is best for everyone.

Just a little push to bring this up again. Hope someone has an idea.

You told us nothing about your batch job, but the problem you have suggests that it’s designed the wrong way. Why do you store SAS tokens somewhere for 30 days, instead of generating them when needed and using them immediately?
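To illustrate the "generate on demand" idea: instead of keeping a long-lived token anywhere, the batch job can build a fresh, short-lived SAS right before each download. Below is a simplified Python sketch of the documented service SAS string-to-sign for a single blob (API version 2019-12-12, `sr=b`); the account name, container, and key are placeholders, and you should verify the exact field order against the Azure Storage documentation before relying on it in production.

```python
import base64
import hashlib
import hmac
import urllib.parse
from datetime import datetime, timedelta, timezone

def make_blob_sas(account, container, blob, account_key_b64,
                  permissions="r", valid_minutes=60,
                  api_version="2019-12-12"):
    """Build a short-lived, read-only service SAS query string for one blob.

    Sketch of the service SAS string-to-sign as documented for API
    version 2019-12-12; double-check the field order against the
    official Azure Storage docs before production use.
    """
    now = datetime.now(timezone.utc)
    start = now.strftime("%Y-%m-%dT%H:%M:%SZ")
    expiry = (now + timedelta(minutes=valid_minutes)).strftime("%Y-%m-%dT%H:%M:%SZ")
    resource = f"/blob/{account}/{container}/{blob}"
    # Field order per the service SAS spec (blob resource, sr=b):
    string_to_sign = "\n".join([
        permissions,          # signedPermissions
        start,                # signedStart
        expiry,               # signedExpiry
        resource,             # canonicalizedResource
        "",                   # signedIdentifier (none)
        "",                   # signedIP (none)
        "https",              # signedProtocol
        api_version,          # signedVersion
        "b",                  # signedResource (blob)
        "",                   # signedSnapshotTime (none)
        "", "", "", "", "",   # rscc, rscd, rsce, rscl, rsct overrides
    ])
    key = base64.b64decode(account_key_b64)
    sig = base64.b64encode(
        hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    ).decode()
    params = {"sp": permissions, "st": start, "se": expiry,
              "spr": "https", "sv": api_version, "sr": "b", "sig": sig}
    return urllib.parse.urlencode(params)

# Hypothetical usage: token is minted seconds before the download,
# so nothing stored on disk can ever be 30 days old.
sas = make_blob_sas("mystorageacct", "invoices", "inv-0001.pdf",
                    base64.b64encode(b"dummy-account-key").decode())
```

Because the token lives only for the duration of one run (here one hour), its expiry simply never becomes a scheduling problem.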

You could try a logic app. It authenticates differently and shouldn’t expire after 30 days.

The logic app could be triggered when a file is created in your container and then push the file to a file share using the on-premises data gateway.
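To make the shape of that workflow concrete, here is an illustrative fragment of a Logic App definition: a blob trigger feeding a File System connector "Create file" action (which runs through the on-premises data gateway). The connection names, folder paths, and query fields below are placeholders, not a copy-paste-ready definition; the real connector paths come from the Logic Apps designer.

```json
{
  "triggers": {
    "When_a_blob_is_added_or_modified": {
      "type": "ApiConnection",
      "recurrence": { "frequency": "Minute", "interval": 5 },
      "inputs": {
        "host": { "connection": { "name": "@parameters('$connections')['azureblob']['connectionId']" } },
        "method": "get",
        "queries": { "folderId": "/invoices", "maxFileCount": 10 }
      }
    }
  },
  "actions": {
    "Create_file_on_file_share": {
      "type": "ApiConnection",
      "inputs": {
        "host": { "connection": { "name": "@parameters('$connections')['filesystem']['connectionId']" } },
        "method": "post",
        "queries": { "folderPath": "\\\\fileserver\\invoices" },
        "body": "@triggerBody()"
      }
    }
  }
}
```

The key point for your partner: both connections authenticate on their own (the blob connection with its stored credentials, the file system connection through the gateway), so no long-lived SAS token has to be kept anywhere in your script.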

Sorry for my late reply, and thank you for your answer.
I don't know how we can generate the token immediately when we download the file, because we run the code in a PowerShell script.

Phew, you got me with this answer. I know we are using logic apps in our Azure environment, but I have no idea how to use one for this specific case. Can you give me a hint that I can forward to our partner who supports us with this? Thanks in advance.

Instead of trying to use an old (and often expired) link to a blob, ask F&O to give you a fresh link (or the data directly). Just as you can call the Azure Storage API from PowerShell, you can call F&O web services as well.
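As a sketch of what "call F&O web services" means in practice: authenticate against Azure AD with client credentials, then call the F&O OData endpoint with the bearer token. The Python example below only builds the two HTTP requests (it never sends them), and the tenant, environment URL, app registration, and entity set name are all placeholders you would replace with your own values.

```python
import urllib.parse
import urllib.request

# Placeholders -- substitute your own tenant, environment URL, and app registration.
TENANT = "contoso.onmicrosoft.com"
FNO_URL = "https://contoso.operations.dynamics.com"

def build_token_request(client_id, client_secret):
    """Client-credentials request against the Azure AD token endpoint.

    Returns a urllib Request; urllib.request.urlopen(req) would
    actually fetch the bearer token.
    """
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": FNO_URL,
    }).encode()
    return urllib.request.Request(
        f"https://login.microsoftonline.com/{TENANT}/oauth2/token",
        data=body, method="POST")

def build_odata_request(bearer_token, entity_set, top=10):
    """GET request against an F&O OData entity set (the name is an example)."""
    url = f"{FNO_URL}/data/{entity_set}?$top={top}"
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {bearer_token}",
                      "Accept": "application/json"})
```

The same two-step pattern (token request, then authenticated OData call) translates directly to PowerShell with `Invoke-RestMethod`, so your existing script could keep running as a batch job with no stored 30-day key at all.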

I would tell them what I indicated above: a logic app that’s triggered when a new file is created in the blob container. You’ll have to install the on-premises data gateway on your on-premises server first, but provided it has proper access to the file share, you can use it as a destination in logic apps.

We had a call today with our D365 partner, and he told me that the solution using a logic app and an on-premises gateway would not run properly because it would still need some kind of SAS token, which again cannot be set to never expire. Can you confirm this, or is it not correct?

You’re mixing two things together.
The on-premises data gateway for Azure Logic Apps allows you to store data on your server.
Using an old, expired link to access a blob is a completely different problem. What Jacob suggested was copying the file immediately when it gets created, which would take seconds or maybe minutes, but definitely not 30 days. I also suggested a solution that can give you a new token at any time, even a year later.

This was just the information we got from our partner, who is managing the whole system and the Azure environment. I'm only a regular system administrator and not that deeply involved in this topic.

To be honest, I was also wondering why a SAS token should be needed by the gateway and/or the logic app itself.