How to import data using an Excel file path in D365FO in X++
Does anyone have code for importing data from Excel into D365FO by providing the file name or file path directly in code, without opening a dialog to select the file?
Do you mean that you’ll give the web application (running in an Azure data center) a file path on your local computer and it’ll connect to your machine and read files on your disk? If so, it’s not going to work. Look at Recurring Integration Scheduler instead. It’s a locally running application that can read your files and push them to D365FO.
Or you can put the file somewhere where it’s reachable from D365FO and either write code to read it, or use an integration platform (such as Logic Apps) to do it for you. For instance, you can create a Logic App reading the file from OneDrive, parsing it and importing data to D365FO via D365FO connector.
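To illustrate the first part of that option, here is a minimal X++ sketch, assuming the file sits somewhere D365FO can reach (the class name, URL and column positions are made up for illustration). It uses the EPPlus (OfficeOpenXml) library available in D365FO:

```xpp
// Hypothetical runnable class; the URL and column layout are assumptions.
internal final class MyExcelReaderJob
{
    public static void main(Args _args)
    {
        // The file must live somewhere D365FO can reach, e.g. an Azure Blob
        // with a SAS token - not a path on the user's local disk.
        str downloadUrl = 'https://mystorage.blob.core.windows.net/import/timesheets.xlsx?<sas>';

        System.Net.WebClient webClient = new System.Net.WebClient();

        using (System.IO.Stream stream = webClient.OpenRead(downloadUrl))
        {
            OfficeOpenXml.ExcelPackage package = new OfficeOpenXml.ExcelPackage(stream);
            OfficeOpenXml.ExcelWorksheet worksheet = package.Workbook.Worksheets.get_Item(1);
            int lastRow = worksheet.Dimension.End.Row;

            // Row 1 is assumed to be the header row.
            for (int row = 2; row <= lastRow; row++)
            {
                str projectId = worksheet.Cells.get_Item(row, 1).Text;
                // ... validate and insert into your tables here
            }
        }
    }
}
```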
Thanks for your reply,
Actually, my requirement is based on file selection in a dialog; it should execute as a batch process (SysOperation framework, asynchronous mode).
To test this, I'm passing the file name/file path into a runnable class in our D365 test environment, pointing at local folders.
That file name has to be passed as a parameter, and based on the parameters some other method is called to import the data.
Please give me a suggestion for this requirement.
What do you mean by local folders? If folders on your local machine, I explained above why it’s not possible. If you mean folders on cloud services running D365FO, it’s not possible either, because you don’t have access to them (in production environments).
If you forget your technical design for a moment and tell us what you’re trying to achieve from a business point of view, we should be able to suggest a solution that is actually possible.
Thanks for your reply,
My requirement is: I have to give the file as an input parameter, and it should execute as a batch process (SysOperation framework, asynchronous mode).
I used a file path EDT as a parameter in the contract class and called it in the service class, but no browse option appears, so for testing I’m hard-coding the file path and trying the scenarios that way.
Can you please suggest how we can get a browse option in the SysOperation framework for data import using Excel upload?
No, what you’re describing is a suggested implementation, not a business requirement.
A business requirement may be, say, to import exchange rates every day. Things like file names and frameworks are mere implementation details. They’re not the business goal.
You should first understand what business users want to achieve, and only then design a technical solution. If you don’t know it, then you must ask your users or functional consultants.
You can easily let users upload a whole file and process it, but - as I explained above - you can’t take a mere file path and read local files in a batch.
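For completeness, the standard way to let a user browse for and upload a file from X++ is `File::GetFileFromUser()`, which shows the upload dialog and places the file in temporary Azure blob storage. A sketch (error handling trimmed):

```xpp
// Sketch: browse for a file and read its content on the server side.
FileUploadTemporaryStorageResult uploadResult =
    File::GetFileFromUser() as FileUploadTemporaryStorageResult;

if (uploadResult && uploadResult.getUploadStatus())
{
    // The file now sits in temporary blob storage, not on the user's disk.
    System.IO.Stream stream = uploadResult.openResult();
    info(strFmt('Uploaded %1', uploadResult.getFileName()));
    // ... parse the stream (e.g. with OfficeOpenXml.ExcelPackage)
}
```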
My requirement is: we need to upload approved timesheets using Excel upload in an on-premises environment, and the Excel upload is taking too much time.
That’s why we were trying a batch process with the SysOperation framework, but no browse option appears there.
We were able to upload the file using Excel upload; I need one parameter with a browse option for selecting the file in the SysOperation framework.
What do you mean by “Excel upload”? (Please realize that if you don’t give us enough information and we have to interrogate you, it’ll take more time for you to get an answer).
If you mean uploading data directly from Excel, consider using data management instead.
If you mean using data management, then clarify whether the slow part is just the upload or the data processing. By the way, the dialog shown when you upload files is an example of a file upload control.
If it’s the processing, then you probably didn’t realize that it’s already done asynchronously.
If the problem is during file upload, please give us more details. Depending on what you’re doing, it might be related more to mapping generation than to the upload. Unless your file is extremely big, the upload itself shouldn’t take too long.
Only the last point would be related to file paths and local files that you originally asked about (and the Recurring Integration Scheduler - mentioned in my very first reply - would be a solution). The other situations aren’t about accessing local files at all - you either manually upload a file to Azure or you push data through OData services.
Thanks for your reply again,
Business requirement in detail: the customer needs an option for uploading timesheet data from Excel. The columns are:
Project Legal Entity,
Line of property
The customer uses a third-party application to fill in timesheets; they use D365 for project management, billing and invoices.
For that, they need an upload option in D365FO.
We tried data management, but with the input columns they gave us it was not possible to import the data.
We have written custom logic for the Excel upload that inserts the data into the timesheet tables (TSTimesheetTable, Line, LineWeek, Trans).
The validations here are extensive and take too much time.
That’s why we are trying to run the Excel upload as a batch process.
That is my requirement.
Can you elaborate on your statement “with the input columns they gave us it was not possible to import the data”? Does it mean that no standard entity meets your requirements? If so, why don’t you implement a custom entity, rather than trying to create a new import framework?
Also, now you’re saying that the problem is with validations, not with the upload (as you claimed in the previous reply). If that’s the case, you can upload the file through the UI (no need to think about file paths and the like), because the performance problem doesn’t lie there, and process the file content asynchronously (the processing includes the validations, which are the problematic part). That’s exactly what data management does. Therefore you can simply use data management, or you can implement something working in a similar way.
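One possible shape of “something working in a similar way” is sketched below. All class and method names here are made up for illustration, and the contract class (with a `parmDownloadUrl` member) and service class are assumed to exist. Also note that the temporary-blob URL returned by `File::GetFileFromUser()` expires after a while, so long-running batches may need to copy the file to permanent storage first:

```xpp
internal final class MyTimesheetImportController extends SysOperationServiceController
{
    public static void main(Args _args)
    {
        // The browse/upload happens synchronously in the user's session...
        FileUploadTemporaryStorageResult uploadResult =
            File::GetFileFromUser() as FileUploadTemporaryStorageResult;

        if (uploadResult && uploadResult.getUploadStatus())
        {
            MyTimesheetImportController controller = new MyTimesheetImportController(
                classStr(MyTimesheetImportService),
                methodStr(MyTimesheetImportService, processFile),
                SysOperationExecutionMode::Asynchronous);

            // ...while parsing and validation run asynchronously in batch.
            // The batch service downloads the file from blob storage itself.
            MyTimesheetImportContract contract = controller.getDataContractObject();
            contract.parmDownloadUrl(uploadResult.getDownloadUrl());

            controller.parmShowDialog(false);
            controller.startOperation();
        }
    }
}
```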