Speed problem with importing ASCII file

Hi all, I have a speed problem with importing an ASCII file. The file is 109 MB. The dataport I have is very basic: it reads the text file and writes to the GenJnlLine table. The speed of the first couple of records is OK, but it drops very quickly to 6-7 seconds per record. What is the reason for this, and is there something I can do about it? All help is welcome. Thanks, Roelof.

Not sure, but you could try: 1. Breaking the file into smaller segments. 2. Committing the records periodically to clear the commit cache, but check that you don't import the same record twice if you hit an error. David Cox MindSource (UK) Limited Navision Solutions Partner Email: david@mindsource.co.uk Web: www.mindsource.co.uk
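Navision itself would be coded in C/AL, but the first suggestion (breaking the file into smaller segments) is a generic pre-processing step. Here is a hedged Python sketch of the idea; the function name, the `.partN` naming scheme, and the chunk size are all assumptions, not part of the original advice:

```python
# Hypothetical sketch: split a large text file into smaller segments
# so each segment can be imported by the dataport separately.
def split_file(path, lines_per_part=50000):
    part, out, count = 0, None, 0
    with open(path, "r") as src:
        for line in src:
            # Start a new part file when the current one is full.
            if out is None or count >= lines_per_part:
                if out:
                    out.close()
                part += 1
                out = open(f"{path}.part{part}", "w")
                count = 0
            out.write(line)
            count += 1
    if out:
        out.close()
    return part  # number of part files written
```

Each `.partN` file can then be imported in its own run, which also makes it easier to restart after an error without re-importing earlier records.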

Committing the changes regularly should do the trick. However, there have been reports by programmers in this forum stating that with dataports a COMMIT does NOT speed up the import of large files. You might therefore want to replace the dataport with a codeunit or a non-printing report. If you do so, please report the results of this test to this forum. ------- With best regards from Switzerland Marcus Fabian

The problem with dataports arises when: 1. You use fixed format, and 2. The record (line) length is larger than approx. 250 characters. It seems that NF is reading the complete file for every record it imports; that is also why the last records are quick. Possible solutions: 1. Do NOT use fixed format, since the variable format does not have the problem. 2. Split the file vertically (remember to put the primary key in both files), and then make two dataports. 3. Use some code that reads one char at a time (see how the Merge Tool does this).
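The third solution above amounts to reading the file sequentially in a single pass instead of letting the dataport re-scan it for every record. The Merge Tool's actual implementation is in C/AL; as a generic sketch of the single-pass idea, here is a Python version for fixed-width records, where the field widths are made-up example values:

```python
# Hypothetical sketch of single-pass streaming: read the file once,
# record by record, and slice each line into fixed-width fields,
# instead of re-scanning the file from the start for every record.
def stream_fixed_records(path, widths):
    with open(path, "r") as f:
        for line in f:
            line = line.rstrip("\n")
            fields, pos = [], 0
            for w in widths:
                fields.append(line[pos:pos + w].strip())
                pos += w
            yield fields
```

Because the file is read strictly forward, the cost per record stays constant no matter how large the file is, which is exactly what the slowdown described above lacks.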

I had the same problem some time ago with files of more than 30 MB… the easiest solution was using UltraEdit to cut the file into 5-6 files, and the process speed really increased without having to use COMMIT in between records. (First records were fast with the 30 MB files too, but after a few records (1000-2000) it started slowing down badly… becoming fast again on tests when it was finishing…) – Alfonso Pertierra apertierra@teleline.es Spain

I have also had this problem… We switched to a Report to do the import instead, using the FILE variable and controlling the loop manually… really fast!

Thanks for all the help. It worked!

Another approach that works well in many cases to speed up importing: deactivate all the keys except the primary key, do the import, then reactivate the keys. Dave Studebaker das@libertyforever

I've usually gotten around this in two ways. As previously suggested, you can split the file up, and that usually helps a lot. Another way is to set up a new primary key and make this primary key a record number. If the app that created the txt file can do it, insert an incrementing record number in the first column. I seemed to have this problem mostly when the primary key was duplicated across several records (such as in a transaction file).
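If the app that produced the file cannot add a record number itself, the column can be prepended in a pre-processing step. This is a hedged Python sketch of that idea; the function name and the tab separator are assumptions, and the separator should match whatever delimiter the dataport expects:

```python
# Hypothetical sketch: prepend an incrementing record number as the
# first column, so every line carries a unique primary key value
# even when the original key fields repeat (e.g. a transaction file).
def add_record_numbers(src_path, dst_path, sep="\t"):
    with open(src_path, "r") as src, open(dst_path, "w") as dst:
        for number, line in enumerate(src, start=1):
            dst.write(f"{number}{sep}{line}")
```

The dataport (or replacement report/codeunit) can then map the first column to the new record-number field of the key.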