FIELD VALIDATION

Hello everybody, I am writing some dataports and importing data into tables. While importing, a dataport does not run the OnValidate trigger of a field (meaning it will not execute the code in the Validate trigger of a particular field), so I would like to validate the fields of a table by calling the Validate trigger externally. I also need to maintain a log file: if any errors occur, I have to write them to the log file and continue the validation process. So I need to externally validate the table data, field by field, for the total number of records.

quote:

The main aim here is to avoid the error prompt if any errors occur during field validation.

NOTICE:

quote:

The VALIDATE function returns void.

Thanks in advance for your suggestions.

I can only say: don't use dataports. If you do want to use a dataport, put all your code in the OnPreDataItem trigger and read the file using your own code (MyFile.READ, a WHILE loop over the file and such). That gives you much better performance and control over what's going on while importing. I only use dataports for one-time imports into empty tables.
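A minimal sketch of that read loop in C/AL (the file path and the variable names ImportFile and Line are placeholders; note that the C/AL File type has no EOF member, so the usual idiom is to compare POS against LEN):

```
// OnPreDataItem - read and process the file line by line
// Assumed variables: ImportFile : File; Line : Text[1024];
ImportFile.TEXTMODE(TRUE);
ImportFile.OPEN('C:\Import\data.txt');  // example path
WHILE ImportFile.POS < ImportFile.LEN DO BEGIN
  ImportFile.READ(Line);
  // split Line into its fields and validate them here
END;
ImportFile.CLOSE;
```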

You may also do your validations in a codeunit which you can call in OnAfterImportRecord as "IF CODEUNIT.RUN(...". If an error occurs, the codeunit run returns FALSE.
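Roughly like this ("My Validation" is a hypothetical codeunit that does the VALIDATE calls and raises ERROR on bad data, and WriteLog is a hypothetical helper that appends a line to the log file; also note that IF CODEUNIT.RUN performs an implicit COMMIT and cannot be called while a write transaction is open):

```
// OnAfterImportRecord
IF NOT CODEUNIT.RUN(CODEUNIT::"My Validation", MyRec) THEN BEGIN
  // the codeunit hit an ERROR; log the record and skip it
  WriteLog(STRSUBSTNO('Record %1 failed validation', MyRec."No."));
  CurrDataport.SKIP;
END;
```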

Hi there, take a look at the properties of a dataport field (press Alt V-P on a field) and you'll see the following property (excerpt from the online Help):

CallFieldValidate
Use this property to determine whether the OnValidate trigger for the field will be executed when the field is imported.
Applies to: Dataports
Settings: Yes (execute the trigger) or No (do not execute the trigger; the default).

Alternatively, you can do the validation in the OnBeforeEvaluateField trigger of each field.

Hello SV, thanks for your information. But here I would like to maintain a log file in order to know which record and which field gets an error. So I have to call the trigger externally, and if the validation fails I need to write to a log file and continue without any trouble to the user. Later the user will look at the log file and may act on it. I need to validate each and every field of every record: if an error occurs, I write it to the log and go on to the next field (I must not miss any field). If an error occurs in a particular record, I need to skip that record, write the record details to the log and continue with the next record. I think you understand my problem now. Thanks once again.
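One simple way to keep such a log in C/AL is a small helper that appends one line to a text file (the name WriteLog, the path and the Text length are all illustrative; LogFile is a local File variable):

```
PROCEDURE WriteLog(Msg : Text[250]);
BEGIN
  LogFile.TEXTMODE(TRUE);
  IF NOT LogFile.OPEN('C:\Import\import.log') THEN
    LogFile.CREATE('C:\Import\import.log');
  LogFile.SEEK(LogFile.LEN);  // append at the end
  LogFile.WRITE(Msg);
  LogFile.CLOSE;
END;
```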

In this case danda, definitely take Lars' advice. In your particular case you absolutely should not use dataports; this should be written in code. It looks like you need reliable transactional processing, and you just won't get that with dataports.

A dataport could be used as well. But instead of importing directly into table fields you can import into user-defined variables. I usually use text variables, e.g. in case an imported date field is not recognized as a known date format. Afterwards I check the contents before assigning the table fields from the text variables, and then the table field is validated in a codeunit. This way an error may occur, but the import does not stop. The properties AutoSave, AutoUpdate and AutoReplace are always set to "No", so (of course) you have to do the INSERT yourself. Dataports save a lot of time if you have a CSV file, but only if the file is "clean": no stray quotation marks in the middle of a field or something like that. In that case it is easier to use a manual FILE.READ and then split the line yourself.

I agree Anfinnur - I often use the same technique. I usually read the data into a text array and from there check the format of each field. If the format is OK you can check the fields for logical errors and then finally save the data to the record. You can keep an error log separately. I also usually save into a temporary table and only save the data into the real tables if no fatal errors were encountered.
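The temporary-table part of that can be sketched like this (TempCustomer is assumed to be a Customer record variable with the Temporary property set to Yes, and FatalError a boolean set while reading the file):

```
// After the whole file has been read into TempCustomer,
// copy to the real table only if nothing fatal was logged
IF NOT FatalError THEN
  IF TempCustomer.FIND('-') THEN
    REPEAT
      Customer := TempCustomer;
      Customer.INSERT(TRUE);  // TRUE runs the OnInsert trigger
    UNTIL TempCustomer.NEXT = 0;
```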

quote:

I agree Anfinnur - I often use the same technique. I usually read the data into a text array and from there check the format of each field. If the format is OK you can check the fields for logical errors and then finally save the data to the record. You can keep an error log separately. I also usually save into a temporary table and only save the data into the real tables if no fatal errors were encountered.
Originally posted by Steffen Voel - 2005 Sep 13 : 13:03:39

But the poor performance of dataports still remains. With large text files it's terribly slow. It's a lot, lot, lot faster to read a text file with your own code.

Steffen and Anfinnur, if you read danda's post,

quote:

But here I would like to maintain a log file in order to know which record and which field gets an error.
Originally posted by danda_kumar - 2005 Sep 12 : 00:35:22

you will see that he needs to know which FIELD causes an error. You can't do this with a dataport, since you will generate a run-time error if, for example, you import text into a decimal field. The ONLY solution in this case is to use code to import a line at a time, then OK := EVALUATE on each variable to find which one is causing the error. Of course you could use a dataport to import every field into a temporary text variable, but that would be more complex and much slower than just writing code. I use dataports for probably 90% of my importing requirements, but in the cases where they don't work I would not waste time trying to hack them; I would just write code. I never use dataports where there is a requirement for unattended operation; it's just too much hassle.
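The OK := EVALUATE pattern might look like this (the field choices, the TxtField array and the WriteLog helper are illustrative; the current line is assumed to have been split into TxtField[1..NoOfFields] already):

```
// One imported line at a time, one field at a time
FOR i := 1 TO NoOfFields DO BEGIN
  OK := TRUE;
  CASE i OF
    1 : OK := EVALUATE(Customer."No.", TxtField[1]);
    2 : OK := EVALUATE(Customer.Name, TxtField[2]);
    3 : OK := EVALUATE(Customer."Credit Limit (LCY)", TxtField[3]);
  END;
  IF NOT OK THEN
    WriteLog(
      STRSUBSTNO('Line %1, field %2: cannot evaluate <%3>',
        LineNo, i, TxtField[i]));
END;
```

Because each field gets its own EVALUATE, a bad value is logged with its line and field number and the loop simply carries on, which is exactly the "log and continue" behaviour asked for.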

David and Lars, I agree, of course. It's a trade-off from situation to situation, depending on data size and data complexity. Dataports are less complex and time-consuming at the expense of control and speed. Judging from danda's post (no offence), though, I think he'll be better off using a dataport. Basically what Anfinnur and I suggest is exactly the same as reading the file with code, but perhaps a little simpler.

How would you then detect a field-level error? This seems to be the key issue he has, and you will need to do an OK := EVALUATE on each field to find out whether the data there is bad.

I have suggested two ways above.

1: Reading the data directly into the record. You can trap the errors on each field using the OnBeforeEvaluateField trigger of each dataport field, i.e.:

OnBeforeEvaluateField(VAR Text : Text[260])
IF NOT EVALUATE(MyDecimal, Text) THEN BEGIN
  ErrorLog;
  Text := '0';
END;

So

quote:

you will see that he need to know which FIELD causes an error. You can’t do this with a Datport, since you will generate run time error if for example you import text into a decimal field. The ONLY solution in this case is to use code to import a line at a time, then OK := EVALUATE on each variable to find which one is causing the error.

is in fact not entirely correct. It is possible to trap those run-time errors.

2: Reading the data into text variables, using an Integer as the DataItem (I usually use text arrays):

OnAfterImportRecord
FOR i := 1 TO NoOfFields DO
  CASE i OF
    1 : IF NOT EVALUATE(MyRec.DateField, MyDataportField[1]) THEN
          ErrorLog
        ELSE
          ValidateDateField;
    2 : IF NOT EVALUATE(MyRec.IntegerField, MyDataportField[2]) THEN
          ErrorLog
        ELSE
          MyRec.VALIDATE(IntegerField);
  END;

I'm sure you'll get the picture.

quote:

… Of course you could use a Dataport to import every field into a temporary text variable, but that would be more complex and much slower than just writing code.
Originally posted by David Singleton - 2005 Sep 13 : 07:54:09

which is exactly what I just said, but I think it's a silly hack of a solution.

In any case, he now has options. Danda, I would recommend that you try both, since I can see you are still in the learning stages, and you have in front of you an excellent learning opportunity.