Hi all, up to Navision 2.60 there was a known performance problem in dataports (during import): if the length of a single row in the text file exceeds 250 characters and the text file to import is big (I mean more than 100MB), the import can take several days (one customer calculated 52 days for a single import!). The problem is solved if you split your file into several files with a row length of less than 250 characters and import them separately. I think the origin of the problem could be that Navision does not support text variables with more than 250 characters. Since in NA3.01 the maximum length of a text variable is 1024 characters, does anybody know if the performance problem is still there? Thanks Marco
In 2.60 I had similar problems. My solution was not to use a dataport, but a report using FILE.OPEN and FILE.READ with FILE.TEXTMODE(FALSE). I read character by character into a Char variable. Then, of course, I have to handle the imported data manually. This method sped up my import routine considerably.
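For reference, that character-by-character approach might look roughly like this in C/AL (a minimal sketch, not Anfinnur's actual code; the file name, the 1024-character line buffer, and the ProcessLine routine are placeholders you would replace with your own parsing and record insertion):

```
// C/AL sketch: read a text file byte by byte in binary mode.
// Variables: ImportFile : File; Ch : Char; Line : Text[1024];
ImportFile.TEXTMODE(FALSE);            // binary mode avoids the 250-char line limit
ImportFile.OPEN('C:\IMPORT\DATA.TXT'); // placeholder path
Line := '';
WHILE ImportFile.READ(Ch) = 1 DO BEGIN // READ returns the number of bytes read
  IF Ch IN [10,13] THEN BEGIN          // LF/CR mark the end of a record
    IF Line <> '' THEN
      ProcessLine(Line);               // placeholder: parse fields, INSERT record
    Line := '';
  END ELSE
    IF STRLEN(Line) < 1024 THEN
      Line := Line + FORMAT(Ch);
END;
ImportFile.CLOSE;
```

Since the file is read in binary mode, rows longer than 250 characters are no problem: you decide yourself how much of each row to buffer before handing it to your own parsing code.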
Hi, the following post discusses the performance of dataports vs. reports/codeunits: http://www.navision.net/forum/topic.asp?TOPIC_ID=3595&FORUM_ID=9&CAT_ID=3&Topic_Title=Runtime+of+Dataports+and+Reports&Forum_Title=Attain%2FFinancials+%2D+Developer+Forum From my own experience, in cases like this it will be much faster and more flexible to handle the data import via a codeunit/report. You won't face the text length restriction, as you can read character by character as Anfinnur suggests. You could also consider programming the breakup of your import file (>100MB) into smaller pieces via C/SIDE, which will make the whole process faster and easier for the customer as well. Saludos Nils
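Such a breakup routine could be sketched in C/AL along these lines (the file names and the 10,000-line chunk size are my assumptions; binary mode again sidesteps the long-line problem):

```
// C/AL sketch: split a big import file into smaller pieces.
// Variables: InFile, OutFile : File; Ch : Char; LineCount, FileNo : Integer;
InFile.TEXTMODE(FALSE);
InFile.OPEN('C:\IMPORT\BIG.TXT');       // placeholder path
FileNo := 1;
LineCount := 0;
OutFile.TEXTMODE(FALSE);
OutFile.CREATE('C:\IMPORT\PART' + FORMAT(FileNo) + '.TXT');
WHILE InFile.READ(Ch) = 1 DO BEGIN
  OutFile.WRITE(Ch);                    // copy the byte unchanged
  IF Ch = 10 THEN BEGIN                 // line feed ends a record
    LineCount := LineCount + 1;
    IF LineCount >= 10000 THEN BEGIN    // assumed chunk size per piece
      OutFile.CLOSE;
      FileNo := FileNo + 1;
      OutFile.CREATE('C:\IMPORT\PART' + FORMAT(FileNo) + '.TXT');
      LineCount := 0;
    END;
  END;
END;
OutFile.CLOSE;
InFile.CLOSE;
```

Splitting only at line feeds keeps every record intact, so the resulting pieces can be imported separately without any manual editing by the customer.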
Dataports are so slow because Navision first runs a check on the file to validate the well-formedness of the records before the real import starts. The file is therefore read twice. As a result the cache is used up very quickly and processing slows down. It is interesting to observe that dataports are pretty fast as long as the size of the file to import is smaller than the available cache memory. Splitting up files is therefore a good idea. For this purpose I use the program File Split Master v7.1 http://www.kaboom.org.uk/magic/ With best regards from Switzerland Marcus Fabian
An easy way to solve the problem with big files is to import the file into Access and then use ADO automation to import from Access instead of from the text file. The speed stays constant the whole time and the process is a lot faster. Importing into an Access file is also really easy if your text file is fixed-width. Regards, Alfonso Pertierra (Spain) email@example.com
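The ADO part of that idea could look roughly like this in C/AL (a hedged sketch only: the exact ADO automation library version, the Jet connection string, the database path, and the ImportTable/field names are all assumptions, and the field mapping is a placeholder):

```
// C/AL sketch: read rows from an Access table via ADO automation.
// Variables (Automation, e.g. 'Microsoft ActiveX Data Objects 2.x Library'):
//   ADOConn : Connection; ADORecSet : Recordset;
CREATE(ADOConn);
ADOConn.Open('Provider=Microsoft.Jet.OLEDB.4.0;' +
             'Data Source=C:\IMPORT\DATA.MDB');   // placeholder path
CREATE(ADORecSet);
ADORecSet.Open('SELECT * FROM ImportTable', ADOConn);
WHILE NOT ADORecSet.EOF DO BEGIN
  // Placeholder: copy the Access fields into a Navision record and insert it,
  // e.g. MyRec."No." := FORMAT(ADORecSet.Fields.Item('No').Value);
  ADORecSet.MoveNext;
END;
ADORecSet.Close;
ADOConn.Close;
```

Because ADO hands you one record at a time from an indexed database rather than an ever-growing text buffer, the throughput does not degrade as the import proceeds, which is the point Alfonso makes about the speed staying constant.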