Running Fixed Assets Depreciation causes Microsoft Dynamics GP to "hang"

September 19, 2011

I just completed a full upgrade of Microsoft Dynamics GP from version 9 to version 2010 R2 for a client, and they were going through their first month-end closing in the upgraded system. Three weeks ago, right after the upgrade, they reported an issue running Fixed Assets depreciation from two laptop computers: when running depreciation, the system would apparently hang, and the only way to recover was to terminate the Dynamics.exe process from Task Manager. Nonetheless, we did not pay much attention to this at the time, since the process completed successfully from another machine, just in time to close the month of August – more on this later.

The client called back on Thursday morning, letting me know they were ready to run Fixed Assets depreciation again, and this time I offered to be onsite to see the problem first hand. So this past Friday morning I drove to their location and stood behind the Sr. Accountant to watch the process in action and spot any possible issues while there. The accountant logged into the company database for which he would run the depreciation, entered his September cutoff date, and clicked on the Depreciate button… and, as luck would have it with some support cases, nothing went wrong and the process completed successfully. Well, after some chuckles and the typical apologies from the client, I was back in my car on the way home.

Fixed Assets – Depreciation Process Information


Halfway through the drive, I received an email saying that as soon as I left, they had logged into another company and were able to reproduce the hang.

Now, I began playing all the typical troubleshooting scenarios in my head… the problem happens only in one company, it can be reproduced by all users, and it can be reproduced on all machines. Typically, when an issue is constrained to one company, it is related to a problem with the data or with the way that company is configured. Not a bad proposition, since I was only dealing with some 300 assets… but I was in my car, remember? So I offered to look at the issue once I was back in front of my computer, since I had already ruled out a specific user or workstation as the culprit.

Back at home, I VPN'd into their system and then RDP'd to the SQL Server. I had the Sr. Accountant log into GP and start the depreciation process again. In troubleshooting the issue, I could see that the depreciation process was correctly added to the Process Monitor and showed as Active, but it did not seem to complete.

Process Monitor

I also ran a SQL Profiler trace and noticed that the same set of T-SQL instructions appeared to be processed over and over at the database level. This told me the depreciation process was stuck in an endless loop of some kind and something was preventing it from finishing.

SQL Profiler Trace

I then opted to run the process from the server with the 'sa' user and noticed (by clicking on the Progress button) that the depreciation was stopping on a particular asset ID. This was promising, because I now had a piece of data to look at.

Fixed Assets Progress window

I queried the Asset Master table and noticed that this particular asset had an acquisition cost of zero. Looking at the Asset Book, I noticed that the Cost Basis was USD $0.01 (1 penny). Not sure why this grabbed my attention, but I asked the Sr. Accountant why they had set this asset up this way, and he replied that they had done it only to record the asset and keep track of its location, but that it had been fully depreciated in the past.

Asset General Information
Asset Book

He also added that the process had been working fine in GP 9.0.

So I figured I would try something by changing the Depreciation Method to “No Depreciation”. After all, if the asset had an acquisition cost of zero and a Cost Basis of 1 penny, what was there to depreciate? I ran the following statement to change the Depreciation Method to “No Depreciation”:

-- Created by Mariano Gomez, MVP
-- This code is licensed under the Creative Commons
-- Attribution-NonCommercial-ShareAlike 2.5 Generic license.

-- Remove the lock for the book being depreciated
DELETE FROM FA40203;

-- Change the depreciation method
UPDATE FA00200 SET DEPRECIATIONMETHOD = 15 WHERE COSTBASIS <= .01;

I then asked the Sr. Accountant to re-run the process and this time it completed in less than 10 seconds and produced the reports he was expecting.

Since fixing the issue was not enough for me, I went out to the Microsoft Dynamics GP Partner Online Technical forum and found a case where a partner reported having the same issue at her client's site. It seems Microsoft has identified and logged this as a problem report, but no concrete fix date has been given for it. So for now, the above query should do.

Also, you could end up with a cost basis of 1 penny at the end of the useful life of an asset, which would throw the system into an endless loop if you attempt to depreciate such assets once more. If you feel this is your case, the above script should also correct the problem.
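
If you want to check whether any of your asset books fall into this category before running the script, a query along these lines should do. This is only a sketch, assuming the standard FA00200 (Asset Book Master) column names used in the script above, where 15 is the Depreciation Method value for "No Depreciation":

-- Sanity check: asset books with a one-penny (or lower) cost basis
-- that are still set to a depreciation method other than "No Depreciation"
SELECT ASSETINDEX, COSTBASIS, DEPRECIATIONMETHOD
FROM FA00200
WHERE COSTBASIS <= .01 AND DEPRECIATIONMETHOD <> 15;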

Until next post!

MG.-
Mariano Gomez, MVP
IntellPartners, LLC
http://www.IntellPartners.com/


Getting the Next Voucher Number for a Payables Transaction Integration

September 7, 2011

At the beginning of the year, I wrote a 2-part series on retrieving document numbers assigned by Microsoft Dynamics GP when a field rule is set to Use Default in Integration Manager, see:

IM – Retrieving document numbers assigned by Microsoft Dynamics GP when field rule is set to Use Default in Integration Manager – Part 1

IM – Retrieving document numbers assigned by Microsoft Dynamics GP when field rule is set to Use Default in Integration Manager – Part 2

Back then, I was addressing an issue faced by many who work with integrations that require passing some value back to a source system, and the complexities involved in retrieving those values, especially when the field rule is set to Use Default, that is, when Microsoft Dynamics GP is assigning the field value, not the integration (via source file or otherwise).

Today, I am looking at a slightly different, though related, issue.

On this occasion, the client wanted to retrieve the next voucher number beforehand for a payables transaction integration and assign it to the voucher number field, but did not want to write their own script. In essence, they wanted to leverage whatever mechanism is already built into Microsoft Dynamics GP's business logic to get the next voucher number.


Payables Setup Options – Next Voucher Number field

Indeed, writing their own code would involve retrieving the field value and incrementing the numeric part of the string. This can sometimes be a nightmare, especially when having to deal with record collisions and with users accessing the system while the integration is running.

In doing some additional reading, I realized that eConnect already included this functionality, so all I had to do was find the SQL Server stored procedure that returns the next voucher number. That stored procedure is conveniently named taGetPMNextVoucherNumber. One down, two more to go!

The second challenge with eConnect stored procedures is determining the parameters that need to be passed in; all eConnect stored procedures are created with encryption, so reading their source was not an option. However, I remembered that SQL Server Management Studio gives you the ability to execute a stored procedure from the UI, and that doing so displays a window listing the parameters, detailing their data types and whether they are input or output parameters.

Execute Stored Procedure option (Right-click)

Execute Procedure window

The good thing about this window is that you can enter values for your input parameters and click OK, and SQL Server will automatically generate a template for executing the stored procedure, with variable declarations, types, and all. The construct looks something like this:

USE [TWO]
GO

DECLARE @return_value int,
        @O_vCNTRLNUM varchar(21),
        @O_iErrorState int

EXEC    @return_value = [dbo].[taGetPMNextVoucherNumber]
        @O_vCNTRLNUM = @O_vCNTRLNUM OUTPUT,
        @I_sCNTRLTYP = 0,
        @O_iErrorState = @O_iErrorState OUTPUT

SELECT  @O_vCNTRLNUM as N'@O_vCNTRLNUM',
        @O_iErrorState as N'@O_iErrorState'

SELECT  'Return Value' = @return_value
GO

This was fantastic, because now I did not have to struggle to understand what needed to be passed in. It so happens that the control type parameter, @I_sCNTRLTYP, requires a zero to retrieve the next voucher number. In essence, I played with the parameter value and compared the results to what I was seeing in the GP interface (above), so here are the parameter values accepted for control type (a quick example follows the list):

0 – Next Voucher Number
1 – Next Payment Number
2 – Next Alignment Number
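
For instance, the same generated template can be reused to peek at the next payment (check) number by passing a control type of 1. A quick sketch, run against a test company:

DECLARE @O_vCNTRLNUM varchar(21),
        @O_iErrorState int

EXEC    [dbo].[taGetPMNextVoucherNumber]
        @O_vCNTRLNUM = @O_vCNTRLNUM OUTPUT,
        @I_sCNTRLTYP = 1,  -- 1 = Next Payment Number
        @O_iErrorState = @O_iErrorState OUTPUT

SELECT  @O_vCNTRLNUM AS N'Next Payment Number',
        @O_iErrorState AS N'Error State'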

Two down, 1 more to go.

Finally, all that remains is putting the scripts together in Integration Manager to call the stored procedure.

As a best practice, I tend to make the database connections persistent throughout the integration. This ensures that connections are opened only once and closed at the end of the integration, improving the overall performance of the integration and reducing the points of failure. So, as you can imagine, a before document or field script is not the place to open and close connections, as these events occur over and over, based on the number of records being integrated.

I typically open the connection in the Before Integration event script, so this is what this script looks like:

' Created by Mariano Gomez, MVP
' This code is licensed under the Creative Commons
' Attribution-NonCommercial-ShareAlike 2.5 Generic license.
'
' Persisting ADO connection

Const adUseClient = 3
Dim oCn

Set oCn = CreateObject("ADODB.Connection")
With oCn
    .ConnectionString = "database=" & GPConnection.GPConnIntercompanyID
    .CursorLocation = adUseClient
End With

GPConnection.Open(oCn)
SetVariable "gblConn", oCn

Note that the connection object is stored in a global variable, gblConn, using the SetVariable statement in Integration Manager.

Once we have the connection piece sorted out, we can focus on adding code to the Voucher Number field script to call the eConnect stored procedure, as follows:

'
' Created by Mariano Gomez, MVP
' This code is licensed under the Creative Commons
' Attribution-NonCommercial-ShareAlike 2.5 Generic license.

' Prepare the SQL statement and retrieve the next voucher number
Const adCmdStoredProc = 4
Const adVarchar = 200
Const adInteger = 3
Const adParamInput = 1
Const adParamOutput = 2
Const PMVoucher = 0

Dim oCmd, NextVoucher

Set oCmd = CreateObject("ADODB.Command")
With oCmd
    .ActiveConnection = GetVariable("gblConn")
    .CommandType = adCmdStoredProc
    .CommandText = "taGetPMNextVoucherNumber" 'the eConnect stored proc

    .Parameters.Append .CreateParameter("@O_vCNTRLNUM", adVarchar, adParamOutput, 25)
    .Parameters.Append .CreateParameter("@I_sCNTRLTYP", adInteger, adParamInput)
    .Parameters.Append .CreateParameter("@O_iErrorState", adInteger, adParamOutput, 4)

    .Parameters("@I_sCNTRLTYP").Value = PMVoucher
    .Execute
    NextVoucher = .Parameters("@O_vCNTRLNUM").Value
    CurrentField.Value = NextVoucher
    'MsgBox NextVoucher
End With

Set oCmd = Nothing

Note how, on this occasion, we use the GetVariable function to retrieve a pointer to the connection object stored in the global variable. We then access the Parameters collection to add the different parameters and set the value for the input parameter of the stored procedure.

When this script is executed within the context of the integration, it effectively returns the next voucher number for the transaction being integrated, from which you can proceed to update this information in your source system, if needed.

Note that by using standard Microsoft Dynamics GP business logic, your integration remains supported if you ever need to open a support incident. Indeed, this is another method for retrieving a document number for your transaction.

Until next post!

MG.-
Mariano Gomez, MVP
IntellPartners, LLC
http://www.IntellPartners.com/


Microsoft’s upcoming products available for download

July 13, 2011

A bit away from the Microsoft Dynamics GP world, but at the same time so close, here is some very good news on the upcoming releases of Microsoft SQL Server "Denali", Office 2010 and SharePoint, and Internet Explorer.

Microsoft SQL Server Code Name 'Denali'

Download SQL Server Code Name “Denali” CTP3

SQL Server Code Name "Denali" Community Technology Preview (CTP3) provides the foundation for the cloud-ready information platform, and will help customers unlock insights across the organization and quickly build solutions that extend data across on-premises and public cloud.

Microsoft Office 2010

Office 2010 and SharePoint 2010 Service Pack 1 Now Available

Service Pack 1 is now available for the 2010 family of products, including Office, SharePoint, Project Server, FAST, Search Server, and Groove Server. For more on deployment, see these articles: Apply Office 2010 Service Pack 1 and Deploy Service Pack 1 for Project Server 2010. Also, download the Service Pack 1 for SharePoint Foundation 2010 and SharePoint Server 2010 White Paper.

Internet Explorer 10

Internet Explorer 10: Platform Preview 2

The latest preview build of Internet Explorer 10, Platform Preview 2, is now available for download. With the second Platform Preview, developers can start working with several site-ready HTML5 technologies.

Remember… should you decide to try any of this code, do so in a test environment.

Until next post!

MG.-
Mariano Gomez, MVP
IntellPartners, LLC
http://www.IntellPartners.com/


Violation of PRIMARY KEY constraint ‘PKSY60100’

July 6, 2011

Just recently, I assisted a partner with an issue they were having creating a new company in Microsoft Dynamics GP 10 – though I suppose the same could happen with any other version. In the process of creating the company record, almost at the end of all the routines executed by Dynamics Utilities, they were getting the error:

Violation of PRIMARY KEY constraint ‘PKSY60100’. Cannot insert duplicate key in object ‘dbo.SY60100’

KB Article 871699 (Secure Link) suggests the problem could be that the DYNAMICS database is associated with a database owner (dbo) other than DYNSA. After running the sp_helpdb system stored procedure, we determined that the database owner of the DYNAMICS database (and of other company databases) was indeed 'sa'. Knowing this obviously helps, and the solution is as simple as changing the database owner back to DYNSA.
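
If you want to verify the owner yourself, the check is a one-liner; the owner column appears in the output of sp_helpdb (shown here against the DYNAMICS database):

EXEC sp_helpdb 'DYNAMICS';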

The partner then ran the sp_changedbowner system stored procedure to reset the database owner to DYNSA and got the following error:

Msg 15110, Level 16, State 1, Line 1
The proposed new database owner is already a user or aliased in the database
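
For reference, the attempted change would have looked something along these lines (a sketch of the sp_changedbowner call, run in the context of the DYNAMICS database):

USE DYNAMICS;
GO
EXEC sp_changedbowner 'DYNSA';
GO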

Having gotten this error, we proceeded to drop the DYNSA from the DYNAMICS database as follows:

USE DYNAMICS;
GO
DROP USER DYNSA;
GO

Having dropped the user from the database, we needed to re-add DYNSA as the database owner of the DYNAMICS database. This time, I decided to try the ALTER AUTHORIZATION statement, as the customer is running Microsoft SQL Server 2008 R2 and sp_changedbowner has been deprecated and will be removed from SQL Server in a future version.

ALTER AUTHORIZATION ON DATABASE::DYNAMICS TO DYNSA;

Once we executed this command, we restarted the company creation process in Utilities and the error was gone.

If you find yourself in a similar situation then this should definitely help.

Until next post!

MG.-
Mariano Gomez, MVP
IntellPartners, LLC
http://www.IntellPartners.com/


Microsoft Dynamics GP "12" Named System Database Architecture

May 23, 2011

I received a lot of feedback from the community as a whole on the 3-part series of articles on Microsoft Dynamics GP “12” Web Client Architecture and I was pleased to know that many of you are embracing the fact that there will be a Web client version of the product and are asking numerous questions about readiness.

While these articles addressed the client portion of the solution, I really did not mention anything about changes in the database architecture and how these will impact the future deployment options. So, I have released a new architecture article on Microsoft Dynamics Community, this time addressing Microsoft Dynamics GP “12” Named System Database Architecture.

In this article I look at the named system database capabilities to be released in GP "12", that is, the ability to assign any name to the traditionally hardcoded DYNAMICS database. I hope you enjoy the article, and if you have any comments or questions, please feel free to post back.

For convenience's sake, I will be adding a new link to the architecture series.

Until next post!

MG.-
Mariano Gomez, MVP
IntellPartners, LLC
http://www.IntellPartners.com/

Edits:

06/06/2011 – Changed article title to fit instructions provided by Microsoft Escalation team and current developments out of Fargo.


Upgrade Strategies for Microsoft SQL Server 2008

May 20, 2011

Back in June of last year, I talked about Microsoft SQL Server 2008 Upgrade Advisor and how it can help in identifying potential upgrade issues when moving from earlier versions of SQL Server – see Microsoft SQL Server 2008 Upgrade Advisor.

Bottom line, upgrading your database server requires careful planning. If you are still considering an upgrade to SQL Server 2008 and are trying to work out whether to do an In-Place Upgrade versus a Side-by-Side Upgrade, then you will want to read Arshad Ali's article on the subject over at SQL Server Central.

Upgrade Strategies for SQL Server 2008

Arshad outlines why you should upgrade, the different strategies available, how to choose an upgrade method, how to put a risk mitigation plan in place for your upgrade, the sequence of events, and much, much more, including how to read the results of the Microsoft SQL Server 2008 Upgrade Advisor tool. Go take a look.

Until next post!

MG.-
Mariano Gomez, MVP
IntellPartners, LLC
http://www.IntellPartners.com/


Microsoft SQL Server performance boosting settings for Microsoft Dynamics GP – Part 1

February 17, 2011

Part 1 of 2 – Microsoft SQL Server performance boosting settings for Microsoft Dynamics GP.


Disclaimer: this is not a “one-size-fits-all” recipe for improving Microsoft SQL Server performance and you will hear me say numerous times that you must consult with your Microsoft SQL Server administrator to understand how these settings may (or may not) apply to your environment.

In addition, I have recommended (and implemented) some of these changes in different customer environments on a case-by-case basis, but I have never had to make them all at the same time, so please don't take this as a "Mariano said I must make all these changes to have my Microsoft Dynamics GP SQL Server fly" message. To the contrary, this post is intended to present some settings that may help improve SQL Server performance, depending on your specific environment. So here is my list:

Minimum memory per query
Increasing the value of this option can improve the performance of queries that use hashing or sorting operations, like those found in SmartList and SmartList Builder. Also, most of the auto-generated table stored procedures (the stored procedures prefixed with zDP; see David Musgrave's article, What do the zDP_ Auto Generated Stored Procedures do?, for more information) will benefit from this change if your SQL Server has a lot of memory available and there are many queries running concurrently on the server.


Server Properties | Memory



SQL Server will automatically allocate, at a minimum, the amount of memory set in this configuration setting. The default value for this option is 1024 KB.
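
Although the screenshot above shows the Server Properties dialog, the same setting can be changed with sp_configure. This is only a sketch; the value shown is an example, not a recommendation:

-- 'min memory per query (KB)' is an advanced option; the default is 1024 KB
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min memory per query (KB)', 2048;  -- example value only
RECONFIGURE;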

Network Packet Size
Use the network packet size option to set the packet size (in bytes) used across the entire network. Packets are the fixed-size chunks of data that transfer requests and results between clients and servers. If an application does bulk copy operations, or sends or receives large amounts of text data, a packet size larger than the default may improve efficiency because it results in fewer network reads and writes. If an application sends and receives small amounts of information, you can set the packet size to 512 bytes, which is sufficient for most data transfers.


Server Properties | Network Packet Size


The default packet size set by Microsoft SQL Server is 4096 bytes. Microsoft does not recommend changing the packet size unless you are certain that it will improve performance.
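
For completeness, this is what the change would look like through sp_configure, should your SQL Server administrator determine it is warranted (a sketch; the value is only an example):

-- 'network packet size (B)' is an advanced option; the default is 4096 bytes
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'network packet size (B)', 8192;  -- example value only
RECONFIGURE;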

Default Index Fill Factor
The Default Index Fill Factor option specifies how full each index page will become. When there is no free space left to insert a new row on an index page, SQL Server creates a new index page and transfers some rows from the previous page to the new one. This operation is called a page split. You can reduce the number of page splits by setting an appropriate fill factor to reserve free space on each index page.

Server Properties | Database Settings

The fill factor is a value from 1 through 100 that specifies the percentage of each index page to be filled with data, with the remainder reserved as free space. The default value for fill factor is 0, which is treated similarly to a fill factor of 100; the difference is that SQL Server leaves some space within the upper levels of the index tree when FILLFACTOR = 0. The fill factor percentage is used only at the time the index is created or rebuilt. If a table contains read-only data (or data that rarely changes, such as master tables), you can set the fill factor to 100. When a table's data is modified very often, as with MRP tables or SOP and GL transaction tables, you can decrease the fill factor to, for example, 70 percent.
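
As with the other options, the server-wide default can be set through sp_configure, and an individual table's indexes can be rebuilt with their own fill factor. A sketch only; the table and values are purely illustrative, so check with your SQL Server administrator first:

-- 'fill factor (%)' is an advanced option; the new default takes effect after a service restart
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'fill factor (%)', 90;  -- example value only
RECONFIGURE;

-- Rebuild the indexes of a heavily modified table with a lower fill factor
-- (SOP10100, the SOP work transaction table, is used here only as an illustration)
ALTER INDEX ALL ON dbo.SOP10100 REBUILD WITH (FILLFACTOR = 70);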

Recovery Interval
The ‘Recovery Interval’ option specifies the maximum number of minutes per database that SQL Server needs to complete its recovery procedures. The default value of this option is 0, which means that SQL Server configures it automatically. SQL Server uses the ‘Recovery Interval’ setting to determine how often to issue checkpoints. Microsoft does not recommend changing this option in the general case, but sometimes you can improve performance by doing so. You can monitor disk-write activity on the data files, and if you see periodic spikes that send disk utilization to 100 percent, you can increase the recovery interval. In this case, Microsoft suggests setting the ‘Recovery Interval’ option to 5 and continuing to monitor.
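
Here, too, the change can be scripted through sp_configure if monitoring shows it is warranted (a sketch; the value of 5 matches the suggestion above):

-- 'recovery interval (min)' is an advanced option; the default is 0 (automatic)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'recovery interval (min)', 5;
RECONFIGURE;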

Until next post!

MG.-
Mariano Gomez, MVP
IntellPartners, LLC
http://www.intellpartners.com/