Speed up an SSRS Inventory Report

I have been working on an inventory-by-product-group report for a specific date in Visual Studio / SSRS. The report works fine when I test with a small subset of items (via a filter in the design). However, when I run it for all items, it takes hours and crashes with a memory error. I’m sure this is related to calculating each item’s inventory through the item ledger entries going back to 2004, and I’m wondering whether there is a way I can speed this up.

Here is the code from Visual Studio if needed, but I’m not a programmer, so I would be limited and need a good explanation of what I need to do.

SELECT [Saris Cycling Group, Inc_$Item].No_, [Saris Cycling Group, Inc_$Item].Description, [Saris Cycling Group, Inc_$Item].[Standard Cost],
[Saris Cycling Group, Inc_$Item].[Global Dimension 2 Code], [Saris Cycling Group, Inc_$Item].[Product Group Code], [Saris Cycling Group, Inc_$Item].[Sales Class],
[Saris Cycling Group, Inc_$Item].[Product Class], [Saris Cycling Group, Inc_$Item Ledger Entry].[Posting Date], [Saris Cycling Group, Inc_$Item Ledger Entry].Quantity,
[Saris Cycling Group, Inc_$Item Ledger Entry].[Entry Type], [Saris Cycling Group, Inc_$Item Ledger Entry].[Source Type],
[Saris Cycling Group, Inc_$Item Ledger Entry].[Document Type]
FROM [Saris Cycling Group, Inc_$Item] INNER JOIN
[Saris Cycling Group, Inc_$Item Ledger Entry] ON [Saris Cycling Group, Inc_$Item].No_ = [Saris Cycling Group, Inc_$Item Ledger Entry].[Item No_]
WHERE ([Saris Cycling Group, Inc_$Item Ledger Entry].[Posting Date] <= @invdate)
ORDER BY [Saris Cycling Group, Inc_$Item].[Global Dimension 2 Code], [Saris Cycling Group, Inc_$Item].No_

Kevin

Kevin,

At a quick look, the first issue, and probably the main one, is that you are filtering on Posting Date in the WHERE clause, which is fine, but the Item Ledger Entry table by default does NOT have an index on Posting Date. So you will want to look at adding that key to your Item Ledger Entry table, but test to make sure it does not hurt performance in other areas. I would add it in the NAV interface, so you don’t have separate SQL-only indexes out there, which are difficult to maintain. Adding it in NAV will automatically append the clustered index of Entry No. to the end of it, making it unique.

One way to test it is to take your SQL statement above, paste it into Query Analyzer, and under the Query drop-down menu at the top select “Include Actual Execution Plan”. Execute the query and you will get a tab with the execution plan, which will show whether it is using an Index Scan or a Seek. You want a Seek, of course; otherwise it is scanning that table over and over because of the join. The plan will also suggest an index to add, but those suggestions are not always the best, as sometimes it just adds all columns as a covering index. I would try adding just Posting Date (Entry No. will be auto-appended, as I said) and see how that works for you. You can use Query Analyzer as described above to quickly see what performance gains you get from adding the key, and then verify that nothing else in NAV takes any kind of performance hit, which I don’t think it will.
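For reference, here is a sketch of the two steps in T-SQL. The index name is illustrative only; as described above, the key should really be added through the NAV client so it stays in sync with the application (NAV will generate its own index name and append Entry No. itself):

```sql
-- Illustrative only: in practice, add the key inside NAV, which will create
-- the SQL index itself and append the clustered key (Entry No_) to make it
-- unique. The index name below is made up.
CREATE NONCLUSTERED INDEX [ILE_PostingDate]
    ON [Saris Cycling Group, Inc_$Item Ledger Entry] ([Posting Date], [Entry No_]);

-- Show per-statement plan information for the session, then run the report
-- query and check whether the join does an Index Seek instead of a Scan.
SET STATISTICS PROFILE ON;
-- ... run the report's SELECT here ...
SET STATISTICS PROFILE OFF;
```

In SQL Server Management Studio the same information is available graphically via Query &gt; Include Actual Execution Plan (Ctrl+M).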

Nick

It looks like Posting Date is a key; however, it is paired with Entry Type. Should I add Entry No. to this key, or make a new key Posting Date, Entry No.? Here are the keys on the table:

Enabled | Key | SumIndexFields
Yes | Entry No. |
Yes | Item No. |
Yes | Item No.,Posting Date |
Yes | Item No.,Entry Type,Variant Code,Drop Shipment,Location Code,Posting Date | Quantity,Invoiced Quantity
Yes | Source Type,Source No.,Item No.,Variant Code,Posting Date | Quantity
Yes | Item No.,Open,Variant Code,Positive,Location Code,Posting Date | Quantity,Remaining Quantity
Yes | Item No.,Open,Variant Code,Positive,Location Code,Posting Date,Expiration Date,Lot No.,Serial No. | Quantity,Remaining Quantity
Yes | Country/Region Code,Entry Type,Posting Date |
Yes | Document No.,Document Type,Document Line No. |
No | Item No.,Entry Type,Variant Code,Drop Shipment,Global Dimension 1 Code,Global Dimension 2 Code,Location Code,Posting Date | Quantity,Invoiced Quantity
No | Source Type,Source No.,Global Dimension 1 Code,Global Dimension 2 Code,Item No.,Variant Code,Posting Date | Quantity
Yes | Entry Type,Nonstock,Item No.,Posting Date,Planned Shipment Date,Location Code | Invoiced Quantity
Yes | Prod. Order No.,Prod. Order Line No.,Entry Type,Prod. Order Comp. Line No. | Quantity
Yes | Item No.,Applied Entry to Adjust |
Yes | Item No.,Positive,Location Code,Variant Code |
No | Item No.,Location Code,Open,Variant Code,Unit of Measure Code,Lot No.,Serial No. | Remaining Quantity
Yes | Item No.,Open,Variant Code,Positive,Expiration Date,Lot No.,Serial No. |
Yes | Entry Type,Item No.,Variant Code,Source Type,Source No.,Posting Date,Contract No. | Quantity
Yes | Item No.,Open,Variant Code,Location Code,Item Tracking,Lot No.,Serial No. | Remaining Quantity
Yes | Item No.,Completely Invoiced,Location Code,Variant Code,Drop Shipment |
Yes | Posting Date,Entry Type |
No | Lot No. |
No | Serial No. |
Yes | Item No.,Global Dimension 1 Code,Global Dimension 2 Code,Location Code,Variant Code,Posting Date | Remaining Quantity
Yes | Item No.,Variant Code,Location Code,Posting Date |


Here it is as a screenshot.

Kevin, I missed the Item No., Posting Date key because I was looking for Posting Date as the first column; but in this case, since the JOIN is on Item No., that would come first, followed by the predicate, which is the WHERE clause. So I tested on my machine, and even forced the second index (the one used when you filter on a range of items), and it didn’t help; in some cases it actually made performance worse because of all the lookups that have to be done on the index. I will continue to look at this, and I have been talking to another SQL PFE about it to see if anything can be done; it is possible the statement may have to be rewritten. You must have a substantial number of records in the ILE table, and since you are pulling all items, SQL is choosing the clustered index for the all-items query. I will keep looking and see what I can come up with, and maybe someone else in the community will have something too, but I will work on it and let you know what we come up with.

Nick

We have 3,845,457 records (and growing) in the Item Ledger Entry table, and 13,000+ items.

Thanks for all the help so far Nick

One last question… in your example above, and I know it depends on the value of the invoice date, but what percentage of the 3-million-plus records would you say the query selects? Are you expecting 2 million rows back, 1 million, much less?

Nick

My understanding of how the inventory is calculated in NAV is that it basically takes all the records to calculate the inventory. If I put in today as the date, it would calculate the inventory and value for every item using all the records. I don’t know of a different way to handle it right now; it’s ugly. Maybe we could keep a year-end total in a separate table and then calculate from the year-end balance forward, rather than going all the way back to 2004. I could be doing this completely wrong, too…
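That year-end snapshot idea can be sketched roughly as follows. The table and column names are invented for illustration, and a real implementation would also have to handle NAV’s cost adjustments posted against prior-year entries:

```sql
-- Hypothetical snapshot table holding on-hand quantity per item/location
-- as of a year-end date.
CREATE TABLE dbo.InventorySnapshot (
    ItemNo         varchar(20)     NOT NULL,
    LocationCode   varchar(10)     NOT NULL,
    AsOfDate       date            NOT NULL,
    Qty            decimal(38, 20) NOT NULL,
    InventoryValue decimal(38, 20) NOT NULL,
    CONSTRAINT PK_InventorySnapshot PRIMARY KEY (ItemNo, LocationCode, AsOfDate)
);

-- At report time, add only the current year's ledger movement to the
-- snapshot instead of summing every entry back to 2004.
SELECT s.ItemNo, s.LocationCode,
       s.Qty + ISNULL(SUM(ile.Quantity), 0) AS QtyAsOf
FROM dbo.InventorySnapshot AS s
LEFT JOIN [Saris Cycling Group, Inc_$Item Ledger Entry] AS ile
       ON ile.[Item No_] = s.ItemNo
      AND ile.[Location Code] = s.LocationCode
      AND ile.[Posting Date] > s.AsOfDate
      AND ile.[Posting Date] <= @invdate
WHERE s.AsOfDate = '2010-12-31'  -- the last snapshot on or before @invdate
GROUP BY s.ItemNo, s.LocationCode, s.Qty;
```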

I think the query above is only a portion of the total SSRS report; you are also doing calculations, and those are part of the SSRS report. But if you are hitting all 13,000 items, and it is scanning all 3.8 million records per item, you can see that 3.8 million times 13,000 makes for a very large dataset. The WHERE clause only narrows by Posting Date, so it depends, of course, on how many records meet the criteria, as you may not use all 13,000 items in the range. But you probably know what I am getting at: the 3.8 million rows alone are no problem, and even a scan would be pretty quick, but that multiplied by the number of items in the range, with calculations on top, can make for a very slow report with a huge dataset. You may need to look at the report itself and see if it can be redesigned around different data, though help with that may require an application expert who knows where all the data is stored and what should be pulled in order to reduce the dataset. I will still look at the query above, but it sounds like there could be a lot more going on in the report than just that query.
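One way to shrink that dataset is to let SQL do the summing before the rows ever reach SSRS, so the report receives one row per item instead of one row per ledger entry. Here is a minimal sketch based on the original query; whether it can replace the report’s calculations depends on what the SSRS expressions actually compute:

```sql
-- Aggregate in SQL rather than returning raw ledger rows to SSRS.
-- One row per item; SSRS then only formats, instead of summing millions of rows.
SELECT i.No_, i.Description, i.[Product Group Code],
       SUM(ile.Quantity) AS QtyOnHand
FROM [Saris Cycling Group, Inc_$Item] AS i
JOIN [Saris Cycling Group, Inc_$Item Ledger Entry] AS ile
  ON ile.[Item No_] = i.No_
WHERE ile.[Posting Date] <= @invdate
GROUP BY i.No_, i.Description, i.[Product Group Code]
ORDER BY i.No_;
```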

Nick

I re-created the report using code from Rashed Amini’s Mibuso blog: http://mibuso.com/blogs/ara3n/2009/07/04/inventory-valuation-in-dynamics-nav/

This is what it looks like now, and it runs very fast:

DECLARE @D date;
SET @D = @InvDate;

Select [No_] as ItemNo, [Description], [Location Code] , [Product Group Code] , [Product Class] , [Sales Class] , [Global Dimension 2 Code] , Qty, InventoryValue, ISNULL(ROUND(InventoryValue/NULLIF(Qty,0),2),0) as UnitCost
From [Saris Cycling Group, Inc_$Item],
(select [Item No_],[Location Code],SUM(Qty) as Qty, SUM(InventoryValue) as InventoryValue
From
(Select [Item No_],[Location Code],cast(sum(Q)+ sum(Q2) as float) as Qty, cast(SUM(InventoryValue) as float) as InventoryValue
from (SELECT [Entry No_], [Item No_] ,[Location Code],[Quantity] as Q,
(SELECT isnull(sum(IAE.[Quantity]),0)
from [Saris Cycling Group, Inc_$Item Application Entry] as IAE, [Saris Cycling Group, Inc_$Item Ledger Entry] as ILE2
where IAE.[Posting Date] <= @D and ILE2.[Posting Date] <= @D and
ILE.[Item No_] = ILE2.[Item No_] and
IAE.[Outbound Item Entry No_] = ILE.[Entry No_] and
IAE.[Item Ledger Entry No_] = ILE.[Entry No_] and
IAE.[Inbound Item Entry No_] = ILE2.[Entry No_] ) as Q2,
(SELECT sum([Cost Amount (Expected)]+ [Cost Amount (Actual)])
from [Saris Cycling Group, Inc_$Value Entry] as V
where V.[Posting Date] <= @D and
[Item Ledger Entry No_] =ILE.[Entry No_]) as InventoryValue
from [Saris Cycling Group, Inc_$Item Ledger Entry] AS ILE
where ILE.[Posting Date] <= @D and ILE.[Positive] = 0) as OutBoundEntries
Group by [Item No_],[Location Code]

UNION

Select [Item No_],[Location Code],cast(sum(Q)- sum(Q2) as float) as Qty, cast(SUM(InventoryValue) as float) as InventoryValue
from (
SELECT [Item No_],[Location Code],[Quantity] as Q,
(SELECT isnull(sum(IAE.[Quantity]),0)
from [Saris Cycling Group, Inc_$Item Application Entry] as IAE ,[Saris Cycling Group, Inc_$Item Ledger Entry] as ILE2
where IAE.[Posting Date] <= @D and ILE2.[Posting Date] <= @D and
ILE.[Entry No_] = IAE.[Inbound Item Entry No_] and
IAE.[Outbound Item Entry No_] <> 0 and
IAE.[Item Ledger Entry No_] <> ILE.[Entry No_] and
IAE.[Item Ledger Entry No_] =ILE2.[Entry No_]) as Q2,
(SELECT isnull(sum([Cost Amount (Expected)]+ [Cost Amount (Actual)]),0)
from [Saris Cycling Group, Inc_$Value Entry] as V where V.[Posting Date] <= @D and
[Item Ledger Entry No_] =ILE.[Entry No_]) as InventoryValue
from [Saris Cycling Group, Inc_$Item Ledger Entry] AS ILE
where ILE.[Posting Date] <= @D and ILE.[Positive] = 1) as InBoundEntries
Group by [Item No_],[Location Code]) as CombinedEntries
Group by [Item No_],[Location Code]) as Detail
Where Detail.[Item No_] = [Saris Cycling Group, Inc_$Item].[No_]
Order by [No_]
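One detail worth checking in the query above: UNION removes duplicate rows, so if the outbound and inbound branches ever produced an identical (Item No_, Location Code, Qty, InventoryValue) row, one of them would be silently dropped before the outer SUM. UNION ALL keeps both rows and is also cheaper, since it skips the duplicate-elimination step. A tiny self-contained illustration of the difference:

```sql
-- UNION deduplicates across the combined result set:
SELECT 'A' AS ItemNo, 5 AS Qty
UNION
SELECT 'A', 5;        -- one row survives

-- UNION ALL keeps every row, which is what an outer SUM needs:
SELECT 'A' AS ItemNo, 5 AS Qty
UNION ALL
SELECT 'A', 5;        -- both rows survive
```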

Why don’t you use a SUM instead of a join? Which version of NAV is this, pre- or post-VSIFT?
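For what it’s worth, on NAV 2009 SP1 and later the SIFT totals are maintained as SQL indexed views, so sums like quantity per item can sometimes be read directly instead of re-aggregated from the raw ledger. The view name and column name below follow the usual NAV naming scheme but are assumptions; check the actual objects in your database before relying on them:

```sql
-- Assumed naming: NAV creates indexed views named <Company>$<Table>$VSIFT$<key no.>
-- for keys with MaintainSIFTIndex = Yes. Verify the exact view and column names
-- in sys.views / sys.columns before using this.
SELECT [Item No_], SUM([SUM$Quantity]) AS QtyOnHand
FROM [dbo].[Saris Cycling Group, Inc_$Item Ledger Entry$VSIFT$3] WITH (NOEXPAND)
GROUP BY [Item No_];
```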

Kevin,

That makes sense; the SUM operators make it much clearer than the earlier single query. I assume the other logic was in the report, and it had to assemble the entire dataset before applying the SUM operators, and that dataset was just too large for the machine to handle. I am glad you figured it out, and using the new query must replace most of the logic that was built into the SSRS report, so that is good.

Nick

You are welcome.