Table Import article continues to gain traction

October 31, 2008

The folks at have picked up on my Table Import article and are running a header in their customary “From the Blogs” section.

Until next post!

Mariano Gomez, MIS, MVP, MCP, PMP
Maximum Global Business, LLC


Developing for Dynamics GP – Weekly Summary

October 30, 2008

This was a very active week over at Developing for Dynamics GP. David Musgrave brings 6 excellent articles that, once again, cover a number of interesting topics from ways to execute scripts across multiple GP company databases, all the way to the impact Microsoft’s new Statement of Direction for Dynamics GP will have on everyday product features. So let’s get started!

Article 1: David's first article — Running SQL scripts against all GP company databases — explores a batch file he developed some time ago with his friend Robert Cavill. This batch file uses the OSQL command-line utility (OSQL.EXE) provided with SQL Server to execute a query against all Dynamics GP company databases. In summary, a T-SQL SELECT statement is executed against the Company Master table (DYNAMICS..SY01500) to retrieve the INTERID column values. These values are then used to execute a script repeatedly, once for each company.
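The idea behind the batch file can be pictured with a rough Python sketch: take the list of INTERID values (which the real batch file retrieves by running a SELECT against DYNAMICS..SY01500 with OSQL.EXE) and build one OSQL command line per company database. The server name, authentication switch, and script path below are hypothetical placeholders, not values from David's batch file.

```python
def build_osql_commands(company_ids, server, script_path):
    """Return one OSQL command line per Dynamics GP company database."""
    commands = []
    for interid in company_ids:
        # -E uses Windows authentication; -d selects the company database;
        # -i points to the script file to run against that database
        commands.append(
            f'OSQL.EXE -S {server} -E -d {interid} -i "{script_path}"'
        )
    return commands

# TWO is the sample company; TEST is a made-up second company
cmds = build_osql_commands(["TWO", "TEST"], "GPSQL01", r"C:\Scripts\fix.sql")
for c in cmds:
    print(c)
```

Each generated line could then be executed in turn, which is essentially what the batch file's loop does.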

Remember, you can always achieve the same thing via T-SQL in Query Analyzer (SQL 2000) or SQL Server Management Studio (SQL 2005). However, this approach involves the use of cursors, as follows:

-- retrieves a list of customers from all companies
declare @companyID char(6)
declare c_company cursor for
select INTERID from DYNAMICS..SY01500

open c_company
fetch next from c_company into @companyID
while @@fetch_status = 0
begin
    exec ('select * from ' + @companyID + '..RM00101')
    fetch next from c_company into @companyID
end

close c_company
deallocate c_company

Article 2: How many times have you wished you could just record a macro and execute it with a large dataset? Most of us who have been in the trenches for a while have known a dirty little secret for quite some time now: using Microsoft Office Word's Mail Merge functionality to merge datasets into a macro file. In fact, macros are used to stress-test Dynamics GP, and you can leverage this feature to do your own stress test.

David previously wrote KB article 953437, but has decided to bring it to light on his blog under the title How to Use Word Mail Merge and Macros to Import Data. Be sure to check this out, as it is a unique chance to explore another interesting way of importing data and/or stress testing your system.
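The Mail Merge trick boils down to repeating a recorded macro body once per data row, substituting each row's values into the recorded keystrokes. A minimal Python sketch of that idea follows; the field names and macro syntax are simplified stand-ins, not the exact Dynamics GP macro language, so treat this purely as an illustration of the merge step.

```python
# A recorded macro fragment with merge-field placeholders, in the spirit of
# what Word's Mail Merge would fill in for each record
MACRO_TEMPLATE = (
    "  TypeTo field 'Customer Number' , '{custnum}'\n"
    "  TypeTo field 'Customer Name' , '{custname}'\n"
    "  ClickHit field 'Save'\n"
)

def merge_macro(rows):
    """Repeat the macro template once per record, like a mail merge would."""
    return "".join(MACRO_TEMPLATE.format(**row) for row in rows)

macro = merge_macro([
    {"custnum": "AARONFIT0001", "custname": "Aaron Fitz Electrical"},
    {"custnum": "ADAMPARK0001", "custname": "Adam Park Resort"},
])
print(macro)
```

The resulting text file is what you would then play back through the macro engine, one saved record per repetition.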

If you are still looking for other ways of importing data into GP, please see my previous article on Table Import.

Article 3: Using VBA with Report Writer comes packed with a complete explanation of the Report Writer bands and how they are associated with Visual Basic for Applications (VBA) events. As David pointed out, “Most people are aware that you can use Visual Basic for Applications (VBA) with Microsoft Dynamics GP forms and the Modifier, but not everyone is aware that VBA can be used with the Report Writer as well.” It's a shame, because Report Writer, while very ‘primitive’ in its behaviour and architecture, still offers a wealth of possibilities compared to more commercial reporting tools. Be sure to check out (as in try) the sample code and evaluate how it can be implemented in your future Report Writer projects.
Don’t forget to check other links posted by David in this same article with tons of examples on how to access data and expose that data onto Report Writer reports.

Article 4: Using ADO with VBA with Report Writer showcases a sample on accessing data stored in tables that cannot be easily linked using standard Report Writer table relationships. By now, many of you probably know that Report Writer only supports one 1-to-Many table relationship on a report, which can be a serious limitation for more complex reports. However, the use of old-fashioned calculated fields, a little VBA, and the new UserInfo connection object can turn Report Writer into a very dangerous tool — just kidding about the dangerous piece :-).

If you are not too familiar with the terminology, be sure to check Microsoft's ActiveX Data Objects (ADO) frequently asked questions page, or you can download the latest copy from here.

Article 5: I ran across this page last Saturday while working on my Table Import article, trying to dig up all the SDKs, and did not think of posting a blog about it. However, David was clever enough to put together a post with links to the Developer Documentation for Microsoft Dynamics GP page for releases 9.0 and 10.0.

Article 6: “Should I continue to develop in Dexterity?” That is the eternal question that customers and, frankly speaking, developers around the world continue to ask as GP evolves toward a more collaborative environment. David answers this by pointing out interesting features highlighted in the latest Statement of Direction for Microsoft Dynamics GP, as it relates to developers.

From personal experience, let me tell you: Dexterity developers are in high demand! Even Microsoft is looking for a few good ones. So don’t get discouraged — but don’t fall asleep either — if you see everyone else shooting to learn Visual Studio. Dexterity is not going away anytime soon. I promise!

Hope you enjoy this explosion of articles, and let David or me know what you would like to see on our sites.

Until next post!

Mariano Gomez, MIS, MVP, MCP, PMP
Maximum Global Business, LLC

Mark Polino on allocating payroll benefits to projects

October 27, 2008

I hope you have been following the series of interesting Project Allocation articles with Mark Polino, MVP. In this third installment, Mark explores the allocation of payroll benefits across projects in Project Accounting, guiding you through a full example with detailed steps and screenshots to make sure you are clear on the process.

The following is the complete list of topics already covered by Mark on the subject of allocations:

· Project Accounting Benefit Allocations
· Project Accounting Unit Allocations
· Project Accounting Expense Allocations

Be sure to read these articles, not only to enhance your knowledge of Project Accounting, but also to become familiar with some of the core features delivered with Feature Pack 1.
Until next post!

Mariano Gomez, MIS, MVP, MCP, PMP
Maximum Global Business, LLC

The often overlooked, yet powerful Table Import

October 24, 2008

Before Integration Manager, eConnect, Web Services, SnapShot, SQL Server Data Transformation Services (DTS), SQL Server Integration Services (SSIS), or any of the supercharged, techno-geek tools you can quickly think of, there was Table Import. Considered by many to be at the bottom of the food chain when it comes to integration tools, the fact is, it still holds its own in today's XML-plagued world.


In past releases of Microsoft Dynamics GP, Table Import was absolutely the best (and fastest!) approach to import data, especially because tools like the ones mentioned above have never been able to cover the entire spectrum of Microsoft integrating products and third-party applications available. Take, for example, Manufacturing or Field Service. While these products have long been around, there is little in the form of tools that can actually get data into their tables in a safe and validated way.

You may be thinking or asking, “well, how do I know what tables I need to import data into?”. The fact is, Table Import does require an understanding of GP's table structures and their relationships — this includes all integrating solutions too! However, Microsoft has put great emphasis on providing Software Development Kits (SDKs) that outline these tables and their columns, and in particular the values required in a record for it to be valid.

Table Import Overview

The following is an example that will import a few customer records using the sample records in the Customer.txt file under the Integration Manager samples directory.

1) Open Table Import. Go to Microsoft Dynamics GP > Tools > Integrate > Table Import. This will open the Table Import Definition window. One advantage of the tool is its ability to save import definitions. For this example, I will use CUSTOMER.

2) Select a Source File Format. Table Import supports files that have been formatted as comma-delimited or tab-delimited. The sample Customer file is a tab-delimited file.

3) Choose the Source File. Click on the folder button, then locate the file to import. The source file for this example can be located under:

C:\Program Files\Microsoft Dynamics\Integration Manager 10\Samples\Customer.txt

4) Select a Destination table. Click on the ellipsis button to open the Choose a Table window. For this example, we will select the RM Customer MSTR table. All columns in the table will be displayed in the scrolling window.

NOTE: Knowing your tables in any product will facilitate this process.

5) Map each source column in the file to the destination column in the table. Highlight each row, then click on the ellipsis button next to the Source prompt on the scrolling window header to select a column from the source file. If you need to map a constant value, enter it in the Constant field on the window, then click Add.

NOTE: For the customer class, I will use the constant USA-ILMO-T1.

Before processing the import, the definition window will look like this:

6) Process the import. Let's go ahead and click on Import to bring in our records, choosing Save when prompted to save our import definition.

Table Import will provide a status of the import while creating a rejection file. The rejection file contains the records that could not be processed and can be used to re-import the exceptions.

7) Run Check Links on the appropriate tables to build any missing records in related tables. This is quite critical, since most tables are interrelated. The check links process will attempt to build those missing references.

The above illustration shows a check links executed after the import. In this case, the customer summary records and address record have been created.
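What Table Import does in steps 2 through 5 can be pictured as: read a tab-delimited source file, then build one destination row per source row by mapping each destination column either to a source column or to a constant. Here is a hedged Python sketch of that mapping; the two-column sample data and the mapping to RM00101-style column names are illustrative only, not the actual layout of Customer.txt or the full RM Customer MSTR schema.

```python
import csv
import io

# Two made-up tab-delimited customer records standing in for Customer.txt
SAMPLE = "AARONFIT0001\tAaron Fitz Electrical\nADAMPARK0001\tAdam Park Resort\n"

# destination column -> ("source", column index) or ("constant", value),
# mirroring the Source/Constant choice in the Table Import Definition window
MAPPING = {
    "CUSTNMBR": ("source", 0),
    "CUSTNAME": ("source", 1),
    "CUSTCLAS": ("constant", "USA-ILMO-T1"),
}

def import_rows(text, mapping):
    """Build one destination row per source record using the column mapping."""
    rows = []
    for record in csv.reader(io.StringIO(text), delimiter="\t"):
        row = {}
        for dest, (kind, value) in mapping.items():
            row[dest] = record[value] if kind == "source" else value
        rows.append(row)
    return rows

for r in import_rows(SAMPLE, MAPPING):
    print(r)
```

The constant USA-ILMO-T1 plays the same role here as the customer class constant in step 5: every imported row gets the same value regardless of the source file.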

Table Import can be an effective way for an end-user with some tech savviness to get data quickly into GP. Don't let the overwhelming number of tools out there scare you away from using it, especially when those tools are not able to address the parts of the application you are interested in integrating data into. Be cautious of the limitations — data validation being one of them — and arm yourself with all the information possible before attempting any data import. Be sure to validate your data externally and apply common sense to ensure a safe import. Run check links and reconciliation where possible, whether provided by the ISV or when importing into standard GP tables. And be sure to check the following resources.

NOTE: if importing data into third party tables, be sure to work closely with the product’s ISV. They are better equipped to guide you and help you through the process.

Software Development Kit (SDK) Resources

Microsoft Dynamics GP
Microsoft Dynamics GP v10 Software Development Kit — PartnerSource, CustomerSource
Microsoft Dynamics GP v9 Software Development Kit — PartnerSource, CustomerSource
Microsoft Business Solutions – Great Plains 8.0 SDK — PartnerSource, CustomerSource

Field Service
Software Development Kit (SDK) for Field Service 8.0 — PartnerSource, CustomerSource

Manufacturing Order Processing SDK for Great Plains 8.0 — PartnerSource, CustomerSource

Other Useful Resources

Many of my fellow MVPs have also blogged at some point on table integrations:

David Musgrave, MSFT. David recently published an article with 14 different ways of obtaining table information in Dynamics GP. This article also includes links to other blogs where this topic has been discussed.

Victoria Yudin, MVP. Check out her series of articles on GP Reports. Victoria provides complete details on some of the tricky flags that exist in some of the GP tables, to be considered when importing more sophisticated data, such as Sales Orders, Purchase Orders, etc.

Mark Polino, MVP. Mark has posted a few downloads on his blog page. Take a look at his GP 10 Table Reference and GP 9 Table Reference Microsoft Excel files.

Former MVP, Richard Whaley continues to deliver some of the best books in the market on everything GP. In particular, you won’t want to miss the Information Flow and Posting title. If you are interested, just click on the Accolade Publications link on the right of my blog.

[10/27/2008 – UPDATE] Tools Resources
This section has been added to include other tools that allow users to import/export data into GP, but that provide table information as well:

SnapShot – Click here to download SnapShot from David Musgrave’s blog site. SnapShot works by copying the contents of selected tables to Ctree files in a separate folder, thus creating a SnapShot of the data. This separate folder can then be copied to a target system and the data inserted back into the actual tables.

Support Debugging Tool – I have written extensively in the past about this debugging tool. SDT has an XML import/export feature that allows users to export the data of a table into XML format and reimport it back. Click here to find all links to download SDT.

I will continue to update this article with more and more resources, so be sure to check regularly. However, feel free to submit your own resources by dropping a comment with a link to them.

Until next post!

Mariano Gomez, MIS, MVP, MCP, PMP
Maximum Global Business, LLC

Filtering Third Party Lookups with Dexterity

October 23, 2008

Over the past few days I have been struggling with a programming issue that I thought could be resolved very easily with standard cross-dictionary development techniques. In fact, I was convinced that most of what I needed could be found in past materials published by my dear friends David Musgrave and Mark Rockwell. In reading through the material and going through the recipe of steps to implement third-party lookup filtering, I realized the technique was using reject record statements to avoid displaying records in the scrolling window, and while I was able to implement it, performance quickly became an issue.

Business Situation

In principle, I am developing a new piece of code for one of my clients that would allow them to transfer Project Accounting contracts to Field Service contracts. However, I needed to limit the contracts in Field Service only to the contracts for the selected customer.


With the implementation of the technique outlined in Pushing the Limits III, the lookup would deliver the filtered results in over a minute! Not acceptable for me, not acceptable for the end-user.

Previously, I had filtered the Item Lookup window using a range where statement based on a few conditions my client needed to set up in Field Service, but this was much easier since the window resided in the DYNAMICS.DIC dictionary file. I was in fact sure that the technique outlined in Pushing the Limits III could be modified to use a range where statement, but was not sure how to accomplish it, since the main issue was capturing a reference to the SVC_Contract_HDR (SVC00600) table buffer associated with the form, not just any table buffer. At this point, I called my friend Mark — David was sleeping at the time given the time difference — and he pointed out that the client could use SmartFill to look up any value typed in the Contract Number field associated with the lookup, and that SmartFill would display its own lookup. Since my client owns SmartFill, this was certainly an option. However, this approach still implied living with my inefficient lookup, since I could not disable my lookup button.
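The performance gap comes down to where the filtering happens: the reject record technique fetches every row and discards the unwanted ones on the client side, while a range where clause lets SQL Server return only the matching rows. A toy Python illustration of the difference follows; the contract and customer values are made up, and this only models the row traffic, not Dexterity itself.

```python
# 1,000 fake contract records spread across 50 customers
contracts = [{"contract": f"C{i:04d}", "customer": f"CUST{i % 50:02d}"}
             for i in range(1000)]

def reject_record_style(rows, customer):
    """Mimic scanning the whole table and rejecting rows in the client."""
    shown, scanned = [], 0
    for row in rows:
        scanned += 1  # every row crosses the wire before being rejected
        if row["customer"] == customer:
            shown.append(row)
    return shown, scanned

def range_where_style(rows, customer):
    """Mimic the server applying the restriction before returning rows."""
    matching = [row for row in rows if row["customer"] == customer]
    return matching, len(matching)  # only matching rows are returned

a, scanned_a = reject_record_style(contracts, "CUST07")
b, scanned_b = range_where_style(contracts, "CUST07")
print(scanned_a, scanned_b)
```

Both approaches show the same rows in the end, but the first one touches all 1,000 records to find 20, which is exactly why the reject record lookup took over a minute.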

Off I went to chat with David at COB and he pointed out that he had done this before. We started to brainstorm on the technique and he then realized that there were some challenges, but more than anything, realized that the Pushing the Limits material did not effectively address filtering third party lookups.

Given David's nature, he came up with perhaps the ultimate article on Cross-Dictionary Third Party Lookup Filtering. Once again, David proves that the boundaries are just in your head and that the power of Dexterity as a customization tool truly relies on the ability to work around what can many times be considered impossible.

Until next post!

Mariano Gomez, MIS, MVP, MCP, PMP
Maximum Global Business, LLC

Mark Polino on Unit Allocations in Project Accounting

October 20, 2008

In his series of Weekly Dynamic articles, fellow MVP Mark Polino continues to explore Project Allocations — he first delivered an introductory overview on Project Allocations a couple of weeks back. In this installment, Mark dives into how to allocate project costs based on units. He delivers important pointers on how to set up unit-based allocations and clearly explains the options available as well as the pitfalls when configuring the associated cost categories.

Please stop by Mark’s blog and let him know what you think about his article. Remember: these features have been delivered as part of Microsoft Dynamics GP Feature Pack 1. You must install these features from the DVD image.

Until next post!

Mariano Gomez, MIS, MVP, MCP, PMP
Maximum Global Business, LLC

Developing for Dynamics GP – Weekly Summary

October 16, 2008

Ok, this may be funny, but trust me, if you had to keep up with David Musgrave, Mark Polino, Victoria Yudin, and all the other talented Dynamics GP bloggers out there, you would soon understand why it's best to do a summary of these guys' posts. So today, I will summarize David's posts for the week.

Ever wonder why you receive an Illegal Address for field ‘PowerUser’ exception? Are you going ballistic trying to find the Dex.ini file after your upgrade to Dynamics GP 10.0? What are you going to do with your Dexterity customizations now that Microsoft has replaced the Sales Transaction Entry interface buttons with the sexier SOP Action button? Keep wondering why your DEXSQL.LOG file refers to a nonexistent desSPRkmhBBCreh column in some table? Well, all the answers are at Developing for Dynamics GP! Be sure to check David's posts and drop him a note on what you think about the articles.

For more information on some of the above topics please check my articles on Dex.ini and upgrading your VBA customizations to address the new SOP Action button.

Until next post!

Mariano Gomez, MIS, PMP
Maximum Global Business, LLC