All fields in a NAV SQL database have a NOT NULL constraint. What you should insert depends on the type of field - for text it will be an empty string. An empty date is more interesting - it is actually ‘1753-01-01 00:00:00.000’ (and for a closing date, the time part is the last second of the day).
Be careful with fields of type Code - these MUST be uppercase only!
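To make the two points above concrete, here is a hedged sketch of a direct T-SQL insert; the table and column names are hypothetical, not actual NAV object names:

```sql
-- Hypothetical NAV table: every column is NOT NULL, so "empty" values
-- must be supplied explicitly - never NULL.
INSERT INTO [CRONUS$Purchase Price]
       ([Item No_], [Vendor No_], [Starting Date], [Ending Date],
        [Currency Code], [Direct Unit Cost])
VALUES ('ITEM001',                      -- Code field: uppercase only!
        'VEND001',
        '2024-01-01 00:00:00.000',
        '1753-01-01 00:00:00.000',      -- NAV's "empty" date
        '',                             -- empty text/code = empty string
        12.50);
```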
Why don’t you just prepare the import dataset by means of T-SQL, so it doesn’t contain duplicate/unwanted rows?
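One way to do this, assuming the raw data already sits in a staging table (all object names below are hypothetical): rank the rows over the key columns and delete everything but the first of each group.

```sql
-- Remove duplicate rows from a staging table before the import.
-- PARTITION BY lists the columns that define a "duplicate".
;WITH Ranked AS (
    SELECT *,
           ROW_NUMBER() OVER (
               PARTITION BY [Item No_], [Vendor No_], [Starting Date]
               ORDER BY [Starting Date] DESC) AS rn
    FROM dbo.ImportStaging
)
DELETE FROM Ranked
WHERE rn > 1;   -- keep exactly one row per key
```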
As always in such cases, I still need to warn you about the consequences of tampering with NAV tables directly via T-SQL scripts, bypassing NAV business logic - you must understand WHAT you are doing, and always test in a non-production database. Purchase Prices might be updated in such a way, but NEVER do so with Ledgers, Journals, Cards, etc.!
The import comes from a flat file source. As far as I know, it is not possible to prepare it with T-SQL directly. As a workaround I will save it in a temporary table and prepare it there. I had hoped there would be a smarter way.
Be assured I will never do such things on main tables.
If you have a “flat” (CSV?) file, you might consider using a DataPort - there you can add some data validation code, too.
DataPort (and XMLPort) are NAV’s native methods of populating tables - they are safer than direct T-SQL and give you more possibilities for data filtering/validation, etc.
Besides, with DataPorts you don’t need to worry about empty fields - NAV will handle them. Uppercase for Code fields still applies - but you can put C/AL code on the dataport to correct this, if the source still contains mixed-case characters.
Adding to 1.): You could also enable the DB property “Maintain Defaults”. If set to Yes, default values for all table fields will be maintained on the SQL side; thus, when writing data from “external” non-NAV applications you could provide just some of the values - everything else (the INIT values, so to speak) would be added by SQL Server internally.
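For illustration, this is roughly the kind of DEFAULT constraint that ends up on the SQL side with “Maintain Defaults” enabled - the table, column, and constraint names here are invented, not what NAV actually generates:

```sql
-- Illustrative per-column defaults (names are hypothetical):
ALTER TABLE [CRONUS$Purchase Price]
    ADD CONSTRAINT [DF_PurchPrice_CurrencyCode]
    DEFAULT ('') FOR [Currency Code];

ALTER TABLE [CRONUS$Purchase Price]
    ADD CONSTRAINT [DF_PurchPrice_EndingDate]
    DEFAULT ('1753-01-01 00:00:00.000') FOR [Ending Date];

-- With such defaults in place, an external INSERT can list only the
-- columns it really has values for; SQL Server fills in the rest.
INSERT INTO [CRONUS$Purchase Price]
       ([Item No_], [Vendor No_], [Starting Date], [Direct Unit Cost])
VALUES ('ITEM001', 'VEND001', '2024-01-01', 9.95);
```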
Regarding 2.): If the text file is large I would not do the import via a Dataport, as that is way too slow. But to keep it simple, I would also just load the text file into a buffer table, then INSERT the data into the final target table, transforming it as you want. (I’m sure with SSIS you could also check for duplicates, but I’d have to look it up …)
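The buffer-table approach could be sketched like this - all object and column names are assumptions for illustration, and the transformations just apply the rules mentioned earlier (uppercase Code fields, no NULLs):

```sql
-- Step 1 (not shown): bulk-load the flat file into dbo.ImportStaging,
-- e.g. via BULK INSERT or an SSIS data flow.
-- Step 2: transform and copy into the target table.
INSERT INTO [CRONUS$Purchase Price]
       ([Item No_], [Vendor No_], [Starting Date],
        [Currency Code], [Direct Unit Cost])
SELECT DISTINCT                               -- drop exact duplicates
       UPPER(LTRIM(RTRIM(s.ItemNo))),         -- Code fields: uppercase, trimmed
       UPPER(LTRIM(RTRIM(s.VendorNo))),
       ISNULL(s.StartingDate, '1753-01-01'),  -- NAV's empty date, never NULL
       ISNULL(s.CurrencyCode, ''),            -- empty string, never NULL
       s.DirectUnitCost
FROM dbo.ImportStaging AS s;
```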
Oops, I hadn’t paid any attention to “Maintain Defaults” - turns out it is so simple [:$]
Did it now, and checked the result - default constraints were added to the fields… Not knowing that this option was already “built-in” in NAV, I was thinking about implementing it myself some time ago, but wasn’t sure what side effects it might cause…
Regarding 2): I solved the problem using a temporary table.
I don’t like Dataports so much. Besides being slow, I would need one for each purchaser, plus a codeunit to start them via LaunchUtility. Quite confusing. In SSIS I have one package with a graphical interface, an integrated FTP download, and the option to save the data into the data warehouse at the same time.