
Code to Import a Note with an eConnect AP Invoice

By Steve Endow

Last week a client asked me to add a new field to their AP Invoice Import.  They wanted to import a unique transaction ID for their AP invoices, which were being generated by their operational system.

They were already using the Document Number field and the PO Number field.  So that left them with the AP Invoice Note field.  Since the unique transaction ID didn't need to be queried or searched within GP, the Note field, while not ideal, would work for them.

But there was one problem.

When I tried to modify my application to import the new value into the AP Invoice Note field, there was no NOTETEXT eConnect node.  That's odd.

I pulled up the handy eConnect Programmer's Guide, jumped to the taPMTransactionInsert documentation, and searched for "note".


No matches found.  What?  Something must be wrong.  Clearly the documentation is incorrect.

Then I checked another eConnect help file.  No note.  Then I checked the taPMTransactionInsert stored procedure.  No NOTETEXT parameter.

So, it would seem that the summer intern who developed the taPMTransactionInsert eConnect method forgot to include the transaction note.  I can't imagine any valid reason why you can't import a Note for an AP Invoice.  And it's unbelievable that eConnect has been around this long and the Note field still hasn't been added.

Although admittedly, it would seem that this is the first time I've ever needed to import an AP Invoice Note, as I don't recall noticing this issue before.

So...  What does one do when eConnect doesn't have what you need?  I call it "off roading".

But what will we need to do in order to import a note for a transaction that we are importing via eConnect?  Well, it's a bit of a hassle.  Not feeling eager to write the code to insert a note, I installed WinGrep on one of my development servers and started searching through my Visual Studio source code files to see if I could find anything.

And behold, I found it.  Back in 2012, I apparently wrote an integration that had to insert notes for cash receipt transactions.  And guess what.  eConnect does not support Notes for cash receipts!  The same summer intern must have coded taRMCashReceiptInsert as well.

So, here is the code that I wrote to insert notes for AP Invoices imported with eConnect.

And yes, I know that you could also use the eConnect Post stored procedures, but I don't like them for three good reasons:

1. Debugging SQL stored procedures is a pain compared to debugging C# in Visual Studio
2. Putting custom logic in a Post stored procedure means having code in an additional location
3. When you apply a GP update or new release, the eConnect Pre and Post procs are typically dropped, and nobody ever remembers to re-script the procs.  Been there!

Yes, I am using a stored procedure to insert and update the note, but that proc is very granular and will not be dropped by a GP service pack or new release.  You could always move the note insert proc into C#, but I don't see much benefit of doing so.


First, there is the stored procedure to perform the insert or update of the note.  It even checks which version of GP is being used and appends the appropriate line break.

IF EXISTS(SELECT * FROM sys.objects WHERE type = 'P' AND name = 'csspInsertUpdateNote')
    DROP PROCEDURE csspInsertUpdateNote
GO

CREATE PROCEDURE csspInsertUpdateNote
    @NOTEINDX int,
    @NOTETEXT varchar(8000)
AS
BEGIN

    --SET NOCOUNT ON;
    DECLARE @TXTFIELD varbinary(16);
    DECLARE @LENGTH int;
    DECLARE @CRLF varchar(10);
    DECLARE @APPEND varchar(8000);
    DECLARE @GPVERSION int;

    --GP 2010+ uses CRLF for notes, whereas GP 9 and 10 use CR only
    SELECT TOP 1 @GPVERSION = versionMajor FROM DYNAMICS..DU000020 WHERE COMPANYID = -32767 AND PRODID = 0
    IF @GPVERSION >= 11
        SET @CRLF = CHAR(13) + CHAR(10);
    ELSE
        SET @CRLF = CHAR(13);

    SET @APPEND = @CRLF + @NOTETEXT;

    --Check if a note record exists
    IF (SELECT COUNT(*) FROM SY03900 WHERE NOTEINDX = @NOTEINDX) = 0
    BEGIN
        --If not, insert a new note record
        INSERT INTO SY03900 (NOTEINDX, DATE1, TIME1, TXTFIELD)
        VALUES (@NOTEINDX, DATEADD(dd, 0, DATEDIFF(dd, 0, GETDATE())), CONVERT(VARCHAR(8), GETDATE(), 108), @NOTETEXT)
    END
    ELSE
    BEGIN
        --If so, append the new text to the existing note
        --Get the text pointer and current length of the note text
        SELECT @TXTFIELD = TEXTPTR(TXTFIELD), @LENGTH = DATALENGTH(TXTFIELD) FROM SY03900 WHERE NOTEINDX = @NOTEINDX;

        UPDATETEXT SY03900.TXTFIELD @TXTFIELD @LENGTH 0 @APPEND;
    END
END
GO

GRANT EXEC ON csspInsertUpdateNote TO DYNGRP
GO
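
Before wiring the proc into an integration, it can be smoke-tested directly in SQL Server Management Studio.  This is just a suggested sanity check, and the note index value below is hypothetical--you would substitute a real NOTEINDX from a test company:

--Hypothetical smoke test in a test company database:
--append a line of text to note index 12345, then view the stored note
EXEC csspInsertUpdateNote @NOTEINDX = 12345, @NOTETEXT = 'Test note line'
SELECT NOTEINDX, TXTFIELD FROM SY03900 WHERE NOTEINDX = 12345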



Next, there is the C# data access method to call the stored proc.

public static bool InsertUpdateNote(string gpDatabase, decimal noteIndex, string noteUpdate)
{

    string commandText = "csspInsertUpdateNote";

    SqlParameter[] sqlParameters = new SqlParameter[2];
    sqlParameters[0] = new SqlParameter("@NOTEINDX", System.Data.SqlDbType.Decimal);
    sqlParameters[0].Value = noteIndex;
    sqlParameters[1] = new SqlParameter("@NOTETEXT", System.Data.SqlDbType.VarChar, 8000);
    sqlParameters[1].Value = noteUpdate.Trim();

    int records = 0;

    try
    {
        records = ExecuteNonQuery(gpDatabase, CommandType.StoredProcedure, commandText, ref sqlParameters);
        if (records == 1)
        {
            return true;
        }
        else
        {
            return false;
        }
    }
    catch (Exception ex)
    {
        Log.Write("An unexpected error occurred in InsertUpdateNote: " + ex.Message, true);
        return false;
    }

}



Here is a helper data access method to get the note index for a voucher.  (The ExecuteScalar method is my own data access wrapper to perform an execute scalar operation.)


public static int GetAPInvoiceNoteIndex(string gpDatabase, string vendorID, string voucherNum)
{
            
    string commandText = "SELECT TOP 1 NOTEINDX FROM PM10000 WHERE VENDORID = @VENDORID AND VCHNUMWK = @VCHNUMWK AND DOCTYPE = 1 ORDER BY DEX_ROW_ID DESC";  

    SqlParameter[] sqlParameters = new SqlParameter[2];
    sqlParameters[0] = new SqlParameter("@VENDORID", System.Data.SqlDbType.VarChar, 15);
    sqlParameters[0].Value = vendorID.Trim();
    sqlParameters[1] = new SqlParameter("@VCHNUMWK", System.Data.SqlDbType.VarChar, 17);
    sqlParameters[1].Value = voucherNum.Trim();
            
    string result = string.Empty;

    try
    {
        result = ExecuteScalar(gpDatabase, CommandType.Text, commandText, sqlParameters);
        return Convert.ToInt32(Convert.ToDecimal(result));
    }
    catch (Exception ex)
    {
        Log.Write("An unexpected error occurred in GetAPInvoiceNoteIndex: " + ex.Message, true);
        return 0;
    }

}


eConnect appears to automatically assign a Note Index to AP invoices, so you don't have to get the next index or assign it, but in case it is of interest, here is my data access method to get the next note index.  (The ExecuteNonQuery method is my own data access wrapper)

public static decimal GetNextNoteIndex(string gpDatabase)
{

    string commandText = "SELECT CMPANYID FROM DYNAMICS.dbo.SY01500 WHERE INTERID = @INTERID";

    SqlParameter[] sqlParameters = new SqlParameter[1];
    sqlParameters[0] = new SqlParameter("@INTERID", System.Data.SqlDbType.VarChar, 5);
    sqlParameters[0].Value = gpDatabase;

    string result = ExecuteScalar(gpDatabase, CommandType.Text, commandText, sqlParameters);
    int companyID = int.Parse(result);

    //Call the smGetNextNoteIndex proc in the Dynamics DB
    commandText = "DYNAMICS.dbo.smGetNextNoteIndex";

    sqlParameters = new SqlParameter[4];
    sqlParameters[0] = new SqlParameter("@I_sCompanyID", System.Data.SqlDbType.SmallInt);
    sqlParameters[0].Value = companyID;
    sqlParameters[1] = new SqlParameter("@I_iSQLSessionID", System.Data.SqlDbType.Int);
    sqlParameters[1].Value = 1;
    sqlParameters[2] = new SqlParameter("@O_mNoteIndex", System.Data.SqlDbType.Decimal);
    sqlParameters[2].Direction = ParameterDirection.Output;
    sqlParameters[2].Value = 0;
    sqlParameters[3] = new SqlParameter("@O_iErrorState", System.Data.SqlDbType.Int);
    sqlParameters[3].Direction = ParameterDirection.Output;
    sqlParameters[3].Value = 0;

    decimal noteIndex = 0;
    int recordCount = ExecuteNonQuery(gpDatabase, CommandType.StoredProcedure, commandText, ref sqlParameters);
    noteIndex = decimal.Parse(sqlParameters[2].Value.ToString());

    return noteIndex;

}



Next, there is the method that ties everything together: it gets the note index for the AP invoice (reserving a new one if the invoice doesn't have one yet), and then inserts or appends the note.

public static bool AppendAPInvoiceNote(string gpDatabase, string vendorID, string voucherNum, string noteText)
{
    bool newNote = false;

    decimal noteIndex = GetAPInvoiceNoteIndex(gpDatabase, vendorID, voucherNum);

    if (noteIndex == 0)
    {
        newNote = true;
        noteIndex = GetNextNoteIndex(gpDatabase);
    }

    bool success = InsertUpdateNote(gpDatabase, noteIndex, noteText);

    if (success == false)
        return false;

    if (newNote)
        success = AssignAPInvoiceNote(gpDatabase, vendorID, voucherNum, noteIndex);

    return success;

}
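
The AssignAPInvoiceNote method isn't shown in this post, but its job is to attach the newly reserved note index to the invoice record.  A minimal sketch of the statement it would presumably execute, assuming the invoice is still in the work table and using illustrative parameter names:

--Hypothetical: stamp a new note index on an unposted AP invoice
--(PM10000 is the Payables work table; DOCTYPE = 1 is an invoice)
UPDATE PM10000 SET NOTEINDX = @NOTEINDX
WHERE VENDORID = @VENDORID AND VCHNUMWK = @VCHNUMWK AND DOCTYPE = 1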



And finally, I call the AppendAPInvoiceNote method after successfully importing each invoice.

success = DataAccess.AppendAPInvoiceNote(gpDatabase, vendorID, voucherNum, noteText);
if (success == false)
{
    Log.Write("Failed to save Note for Vendor " + vendorID + " transaction " + voucherNum, true);
}



Pretty straightforward, but I am so glad I didn't have to write that from scratch...again, particularly because this was an urgent request and I just didn't have the time available.  That would have been a fair amount of research and coding just to insert a Note that eConnect should be importing for me.


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter





Interrupting Printing Edit Lists

Once is a fluke. Twice makes you think about it.  Last month we had a client who had a large batch of sales transactions (think 1,000s of transactions).  They printed an edit list, and the system locked up.  So what  did they do?  Something they thought would be benign...they used Ctrl-Alt-Delete to end their GP session.  And here is where it goes awry...


When they logged back in to GP, it told them that there was a batch in batch recovery. Okay.  Fine.  So they go to recover the batch, and are MORTIFIED when it goes ahead and POSTS THE BATCH.  Before they were ready, before they had completed some additional steps to their own process.  Ugh.  Double Ugh.  Triple Ugh.


So we worked through restoring a backup, and I admittedly did not think too much about it other than it being an odd fluke.  Until it happened again, to a different client.  And then I got to thinking how it makes PERFECT SENSE.  Yes, perfect sense.


Printing an edit list uses the same report as printing a posting journal, so presumably it changes the batch's status when printing.  So it would make sense that if that process is interrupted, the batch's status in the SY00500 table might indicate that it was indeed in the posting process.  Batch recovery would then take that information and act on it.


So what to do?  Rather than use batch recovery in this instance, I would recommend using the batch stuck scripts available in this KB article:


https://support.microsoft.com/en-us/kb/850289


Make sure to use the "Let Me Fix It Myself" option of actually running the scripts to reset the batch status and make it available for continued review/editing.
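
If you want to see which batches a company database considers to be mid-posting before running those scripts, you can check the batch status directly.  Here is a quick sketch, assuming the batch status values described in the KB article (anything other than 0 generally means the batch is marked, posting, or stuck):

--List batches in the company database that are not in an available status
SELECT BACHNUMB, BCHSOURC, BCHSTTUS, MKDTOPST, NUMOFTRX
FROM SY00500 WHERE BCHSTTUS <> 0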

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a senior managing consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Mekorma MICR, Where Did My Stubs Go?

Another interesting case this past week related to Mekorma MICR Checks.  A lot of our clients use this ISV solution, which works great (admittedly we sell it often for the benefits beyond the MICR line, in some cases to clients who don't want to even print the MICR line).

This particular case involved a version change: we were applying the latest service pack to a client for GP 2013, which would move them to GP 2013 R2.  Well, with this came a change in how Mekorma MICR stores the path to the stubs library.  In the original versions, paths were specific to a workstation/install.  But in GP 2013 R2 and later, the paths are a global setting (which is definitely a good thing).

In this case, the client had two machines: a SQL server and a terminal server.  And there was a GP shared location where all of the normal reports and forms dictionaries, amongst other shared resources, were stored.  The update was applied to the SQL server first, whose stubs library was pointed to the GP share.

The issue came after we updated the terminal server and found that our modified stubs were missing!  As it turns out, the terminal server was actually pointed to a local path for its stubs library.  So when the global setting was set to the shared location (where there were stubs, since the SQL server was pointed there previously), the terminal server was no longer viewing the previously modified stubs.

Fortunately it was an easy fix: we located the stubs on the terminal server and copied them over to the share.  But it is an important point to note, especially if you have numerous workstation installs.  We are careful to check for local reports and forms dictionaries, and we are now going to check for local stubs libraries as well when applicable.

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a senior managing consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Dynamics GP VS Tools Reference Quirk: Microsoft.Dexterity.Bridge

By Steve Endow

I'm developing a Dynamics GP VS Tools AddIn and noticed an odd behavior.

I was trying to access the Dynamics Globals object properties to get the current company ID and system database name and store it in my Model object.

Controller.Instance.Model.GPCompanyDatabase = Dynamics.Globals.IntercompanyId;
Controller.Instance.Model.GPSystemDatabase = Dynamics.Globals.SystemDatabaseName;

I have these two lines in another project, so I copied them to my current project.


Intellisense recognized Dynamics.Globals and let me choose the two properties, but I was getting an error about type conversion to a string.

Since I have used these exact lines previously, I suspected something wasn't right with my current project.

I had a reference to Application.Dynamics, and I had this using statement:

using Microsoft.Dexterity.Applications;

Since Dynamics.Globals was being picked up by Intellisense, it seemed like my references were okay, but obviously something wasn't quite right.

Another odd thing I noticed was that if I typed a period after SystemDatabaseName or IntercompanyId, I wasn't getting an Intellisense pop up of options.


So something was wrong--clearly Visual Studio wasn't able to determine the data types for those properties.  I was able to use a Convert.ToString call to bypass the error, but it bugged me.  It seemed like there was some type of issue with my Application.Dynamics reference.

After checking my other code and trying various things, I finally stumbled across the solution.

I needed to add a reference to Microsoft.Dexterity.Bridge.


Once I added the Bridge reference, Intellisense stopped complaining about the type conversion, and I was able to get Intellisense values for SystemDatabaseName and IntercompanyId.


Only after looking at it again today did I realize that a big clue was staring right at me.


The error was indicating that it was a Dexterity Bridge data type, but I didn't think to look at that detail, and probably only in hindsight was this clue helpful.  But it explains why Bridge is required, and now I know to reference both libraries!

Happy bug hunting!


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter





SQL Server name limitation with GPConnNet and VS Tools

By Steve Endow

I previously wrote about a situation where the Dynamics GP GPConnNet library is unable to connect to a SQL Server instance if a port number must be specified.

This week I encountered a new limitation with GPConnNet and VS Tools.  A customer has been successfully using a Dynamics GP AddIn for several years, and they are now upgrading to GP 2015.  When they tried to test my AddIn on their GP 2015 test server, they received this error.


The error message says GP login failed.  But since this is a VS Tools AddIn that runs inside of GP after a user has logged in, that message doesn't make much sense.  We know that the username and password are correct.  Very odd.

We then noticed that the server name was incomplete.  The final "T" in the name was missing, and the value displayed is exactly 15 characters--more than a coincidence.  So it looks like the 16 character server name is being truncated to 15 characters, and that is likely the cause of the problem.

But wait!  If the server name is being truncated, then the server name would be incorrect.  And when the AddIn attempted to connect to that non-existent server to authenticate, the connection attempt would fail, right?  The error message would be different for a connection failure.

So back to the original error message.  It says "GP login failed", not "failed to connect" or something similar.  So this would seem to tell us that the connection was successful, but that the login subsequently failed.

What in the world?

But it gets better.

If the customer logs in to Dynamics GP using the 'sa' login, the AddIn works and does not give the "GP login failed" message.


So the sa account works, but GP users don't work.  What does that tell us?  In theory, it is a confirmation that the AddIn connection process is working, but that there is something about the GP logins that is failing.

So why would sa work, but not a GP user login?

My guess is Dynamics GP password encryption.

When you create a new user in Dynamics GP, it "encrypts" the password before sending it to SQL Server.  This prevents a user from connecting directly to the SQL Server.

My guess is that GPConnNet uses the SQL Server name in the "encryption" process, but it is truncating the server name at 15 characters for some reason, and that is the cause of this issue.  Presumably Dynamics GP does not do this, since my client is able to login to GP just fine.

So how do you work around this issue?

The best option is to make sure that your SQL Server instance names are no more than 15 characters.
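
A quick way to check your server name length from a query window (note that @@SERVERNAME includes the instance name, if any, after a backslash--the machine name portion is what matters here):

--Check the SQL Server name and its length
SELECT @@SERVERNAME AS ServerName, LEN(@@SERVERNAME) AS NameLength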

The only other option I was able to come up with was to have the client create a shorter SQL Server Alias.  I then had to hard-code that shorter alias name in my AddIn.  Once I hard coded the shorter alias for the server name, the AddIn worked fine.

Why hard code, you ask?

Well, VS Tools uses the Dynamics GP Backup / Restore form in order to get the name of the SQL Server.  Even if the Dynamics GP ODBC DSN is set to use a short alias name, the Backup / Restore window will return the actual SQL Server name.  So even after the alias was set up and the GP ODBC DSN was using it, my AddIn was still receiving a SQL Server name of MCCGP15DB01-TEST, and the login would still fail.  Fortunately, they only have this issue with their test database server--their GP 2015 production SQL Server has a shorter name.

So, like I said, just make sure your SQL Server instance names are 15 characters or less if you are using GPConnNet.


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter



Adding a Visual Studio snippet for a C# "public string" property

By Steve Endow

When I am developing in Visual Studio and adding properties to a class, by far the most common property I add is a string type property.  When developing for GP, over 90% of my properties are strings.

While I have been using the "prop" snippet to automatically type out most of the property declaration, when you have a few dozen properties, it gets tedious and repetitive having to specify "string" over and over.


Typing "prop" and pressing tab twice will automatically create a property line ("automatically implemented property").

But, the default data type is "int", so I usually have to press TAB and then type "string", then press TAB again to name the property.

So my typing laziness finally overcame my research laziness and I actually looked up how to create a custom snippet in Visual Studio.  It's pretty easy.  Should have done it years ago.

The location of the snippets may vary by Visual Studio version and other factors, but on my machine, the snippet files are located at:

C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC#\Snippets\1033\Visual C#

I located the "prop.snippet" file, copied it to my desktop, and edited it.

Here is the original prop.snippet:


<?xml version="1.0" encoding="utf-8" ?>
<CodeSnippets xmlns="http://schemas.microsoft.com/VisualStudio/2005/CodeSnippet">
  <CodeSnippet Format="1.0.0">
    <Header>
      <Title>prop</Title>
      <Shortcut>prop</Shortcut>
      <Description>Code snippet for an automatically implemented property
Language Version: C# 3.0 or higher</Description>
      <Author>Microsoft Corporation</Author>
      <SnippetTypes>
        <SnippetType>Expansion</SnippetType>
      </SnippetTypes>
    </Header>
    <Snippet>
      <Declarations>
        <Literal>
          <ID>type</ID>
          <ToolTip>Property type</ToolTip>
          <Default>int</Default>
        </Literal>
        <Literal>
          <ID>property</ID>
          <ToolTip>Property name</ToolTip>
          <Default>MyProperty</Default>
        </Literal>
      </Declarations>
      <Code Language="csharp"><![CDATA[public $type$ $property$ { get; set; }$end$]]>
      </Code>
    </Snippet>
  </CodeSnippet>
</CodeSnippets>


In my copy, I changed the title, shortcut, description, and author, removed the "type" literal declaration, and hard-coded "string" in the code template.


<?xml version="1.0" encoding="utf-8" ?>
<CodeSnippets xmlns="http://schemas.microsoft.com/VisualStudio/2005/CodeSnippet">
  <CodeSnippet Format="1.0.0">
    <Header>
      <Title>props</Title>
      <Shortcut>props</Shortcut>
      <Description>Code snippet for an automatically implemented string property
Language Version: C# 3.0 or higher</Description>
      <Author>Steve Endow</Author>
      <SnippetTypes>
        <SnippetType>Expansion</SnippetType>
      </SnippetTypes>
    </Header>
    <Snippet>
      <Declarations>
        <Literal>
          <ID>property</ID>
          <ToolTip>Property name</ToolTip>
          <Default>MyProperty</Default>
        </Literal>
      </Declarations>
      <Code Language="csharp"><![CDATA[public string $property$ { get; set; }$end$]]>
      </Code>
    </Snippet>
  </CodeSnippet>
</CodeSnippets>



Once you have your new snippet, you can go to Tools -> Code Snippets Manager -> Import.  One recommendation when you import--only select one category.  For example, either use only My Snippets, or select only C#, etc., otherwise Intellisense will detect multiple snippets with the same name.

With my new "props" snippet imported, I save a press of the TAB key and don't have to type "string" every time.  The word "string" is now hard coded in the output and it jumps straight to the property name.


If you code properties, I recommend looking into customizing your snippets.  And I'm sure there are tons of other cool things you can do with snippets, but for now, my typing laziness is content.


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter




Using the Dynamics GP PM00400 table for Payables Imports

By Steve Endow

I'm developing two different custom Dynamics GP Payables eConnect integrations for two different customers.  One of the shared requirements is to validate the data for each Payables transaction before it is imported.

The import will confirm, among other things, that the vendor exists in GP, that the GL distribution accounts are valid, that the amounts are valid, and lastly, that the transaction does not already exist in GP.  So if Invoice 12345 for vendor ACME is already present in GP, the import should flag the record as a duplicate and skip it.

To check for a duplicate PM transaction, you could check the PM transaction tables directly.  While definitely possible, the downside to this approach is that an invoice or credit memo may be in one of three different tables:  Work, Open, and History (PM10000, PM20000, and PM30200).

All three of the tables could be queried with a single SQL statement, but there is an alternative.  Instead of querying those three tables, you can query the PM00400 "PM Key Master" table instead.

PM00400 should contain every voucher number and vendor document number processed by the Payables module, so you should be able to query it to determine if a transaction already exists, and the table should tell you the document status (work, open, or history).


Tracing a sample transaction through PM00400 as the invoice moved from work to open to history, the CDSTATUS field changes from 1 to 2 to 3, and the TRXSOURCE value is populated once the transaction is posted.

So PM00400 can be a handy option to check whether a Payables transaction already exists in GP, and a quick way to verify the status of the document.
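
A duplicate check along these lines might look like the following sketch.  The field names follow the description above, and the literal values are placeholders for your source data:

--Check whether vendor ACME's document 12345 already exists in Payables
--CDSTATUS: 1 = Work, 2 = Open, 3 = History
SELECT CDSTATUS, TRXSOURCE FROM PM00400
WHERE VENDORID = 'ACME' AND DOCNUMBR = '12345' AND DOCTYPE = 1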

The Receivables module has a similar "RM Key" table, RM00401.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter







Revenue Accruals in Project Accounting

The conversation about revenue recognition in project accounting for Microsoft Dynamics GP often goes like this...

"How do you do revenue recognition?"

"Oh, we don't.  Just hit revenue when we bill the expenses and fees."

"Oh, okay, so no need for revenue recognition?"

"No.  But we need to accrue at month end for unbilled."

Or, sometimes, it devolves into a conversation about how to use the revenue recognition module to handle the accrual.  Which is neither fun, nor simple.  Particularly because GP revenue recognition effectively calculates one of three ways...

1. Based on contract/project/cost category progress (either by cost, or by hours)
2. When the project is complete
3. Over time when dealing with a service fee (based on duration of service fee)
But many folks don't realize that GP Project Accounting has built-in functionality to deal with accruing unbilled revenue (and therefore the offset, unbilled AR, as well).

In this case, you would use a Time and Materials project, but with a When Performed accounting method.  The When Performed accounting method generates a credit to Unbilled Project Revenue and a debit to Unbilled Accounts Receivable when a cost is posted to the project.  The amounts posted to these accounts are determined based on the budgeted profit type and project amount/percent.  Then when the expenses are billed, the amounts move from unbilled to actual revenue and AR.  At any point, the unbilled AR and unbilled revenue represent your accrual.

Of course, this works great for expenses, but what about fees?  And the need to accrue them?  You can use this same logic, but instead of using a fee, we use a zero-cost miscellaneous log.  So you set up the miscellaneous log cost category with no cost, a quantity equal to the total amount of the fee (if known), a profit type of billing rate, and a billing rate of $1.  Yes, $1.  Then, at month end, you can enter a miscellaneous log with zero cost, where the quantity is equal to the amount of the fee you want to accrue.  It will then calculate the accrued revenue and accrued AR for the GL posting for you.  This is particularly helpful when the accrual of the fee is not predetermined, but rather calculated based on criteria outside of the system (project milestones, etc.).

The neat part of handling the fees as zero-cost miscellaneous logs is that they then go into the WIP queue for billing at the amount you want to bill.


Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a senior managing consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.




Unable to run SQL query on GP data with dates - Conversion of a varchar to a datetime error

By Steve Endow

An interesting Dynamics GP query question came up on Experts Exchange.  The user was asking how to select transactions for a given date range from the PM30300 table.  Pretty straightforward--I recommended this query as a start:

SELECT * FROM TWO..PM30300 WHERE DOCDATE BETWEEN '2017-01-01' AND '2017-03-31'

The user tried it, but said that he received the following error:

Msg 242, Level 16, State 3, Line 1
The conversion of a varchar data type to a datetime data type resulted in an out-of-range value.


Puzzling.  I asked him to run this sp_help statement to verify that the GP PM30300 table did have the correct datetime data type for the DOCDATE field.

EXEC sp_help PM30300

The table looked fine and the DOCDATE field was set as a datetime data type.

DOCDATE    datetime


Very odd.  So the table looks okay.  The query looks okay.  And I assumed the data in the DOCDATE field was okay.

So why would a query filtering dates give a data type error?

I looked at the error again.  Conversion of a varchar to a datetime.  When we use a date filter of '2017-01-01', that is a string, and SQL Server is implicitly converting it to a datetime data type.

So that means that for some reason, when the user sent in the value of '2017-01-01', SQL Server failed to convert it to a datetime.  But that date format obviously works for me, so why wasn't it working for him?

Enter the mess called regional settings.  Start by running this statement to view the SQL User Options settings.

DBCC USEROPTIONS

In the results, look at the language and dateformat values.


My settings showed a language of us_english, and a dateformat of mdy.  So with these settings, SQL Server is apparently able to implicitly convert the '2017-01-01' date value using the mdy date format.

But then I ran this statement against the sa login:

ALTER LOGIN [sa] WITH DEFAULT_LANGUAGE = [British];

This change only takes effect the next time you connect, so you have to close your query and open a new query.  When I do that, here is what I see.


In addition to changing the language, the dateformat changes.

It is possible to change the dateformat value directly, but that change will only persist for the active connection. Once the connection is closed and recreated, the setting will default back to the user options value.

So now that I have set my language to British, which has a dateformat of dmy, what happens when I run my simple query with a date filter?


There ya go.

So it would seem that the user has a language setting other than us_english for their SQL Server login, and that language in turn has a dateformat other than mdy.

The simple fix would be to just run this statement for whatever login is having the issue:

ALTER LOGIN [sa] WITH DEFAULT_LANGUAGE = [us_english];

This sets the language to us_english and the dateformat to mdy.  Once that setting is changed and you reconnect to SQL, you should be able to query with a date format of '2017-01-01'.

The potential downside is that there may be other applications that rely on the language value, and dateformat, that may break if you change to us_english.  If you are unable to change the default language setting, you have two options.

You could change the date format you use in your queries:

SELECT * FROM TWO..PM30300 WHERE DOCDATE BETWEEN '15-01-2017' AND '31-03-2017'

The problem with this is if you have users with different language settings.  If a us_english user tries to run this query, it will fail with the same 'conversion of varchar' error.

Another option is to explicitly cast your date values to datetime:

SELECT * FROM TWO..PM30300 WHERE DOCDATE BETWEEN CAST('15-01-2017' AS datetime) AND CAST('31-03-2017' AS datetime)

But in my test, even this does not work for users that have different language and dateformat settings.
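
One more option worth knowing: for the datetime data type, SQL Server interprets the unseparated 'yyyymmdd' format the same way under every language and dateformat setting, so it is a safe format to hard-code in queries that different users may run:

--'yyyymmdd' literals are parsed identically under mdy and dmy settings
SELECT * FROM TWO..PM30300 WHERE DOCDATE BETWEEN '20170101' AND '20170331'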

As a last resort, it looks like this option would work:

SET DATEFORMAT mdy

This statement would have to be run before every query, as it only persists during the connection.  But it ensures that you are using a known date format for all queries.

This is the first time I've run into this, but since most of my customers are in the US, it isn't too surprising.  

Consultants in other countries may run into this regularly.

Now if only the rest of the world could see the error of their ways and finally start using mdy, inches, ounces, pounds, miles per hour, etc.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter


Musings on some oddities of software purchasing habits

By Steve Endow

In addition to developing Dynamics GP integrations and customizations, I sell a few add-on solutions for Dynamics GP, such as my AP Payment & Apply Import, and Envisage Software's Post Master Enterprise.

Over the last 6 years selling software, I've noticed that people seem to behave differently when purchasing software versus consulting services.

Here are a few observations I've made.

1. Patience:  When people are shopping for, evaluating, testing, and purchasing software (versus consulting services), they seem to be more impatient.  If I am unable to respond to an email or inquiry the same day, I've noticed that many people get impatient and send additional messages or web site inquiries.  "I emailed you yesterday but haven't received any response!".  I try my best to respond promptly, usually within an hour, but sometimes I'm sick, out of the office, travelling, or actually buried deep in code and can't respond the same day.  In general, it seems that people who are working with me on consulting projects are much more patient than the software prospects and customers.  I guess this can be attributed to the prevalence of online shopping for just about everything, and Amazon's same-day and next-day shipping have further heightened our expectations for immediate delivery of products.

2. Trial license key versus final license key:  I always provide trial license keys to customers to allow them to fully test the software before they purchase.  Usually this works out fine, and the customer is able to use the software with their trial license key while they process a payment.  They then receive their final license key before the trial expires, and they have uninterrupted use of the software.  But, somewhat related to point #1 above, there are occasionally customers (and sometimes partners) who are surprisingly eager to get the final license key.  Even though they have 20 days left on the trial, they will suddenly ask to pay for the software ASAP and get the final key ASAP.  I don't mind processing the payment quickly, but these requests puzzle me.  I can only assume that there is some psychological component about a trial vs. final license key that causes this?

3. Credit cards:  Since I started Precipio Services 8 years ago, I haven't had a single customer ask to pay me for my consulting services with a credit card.  Zero.  And I've only had one partner pay me via ACH.  But when it comes to software, a majority of the purchases are paid by credit card.  I have some customers that have purchased both services and software from me--they pay for my services with a check in the mail, but they want to purchase the software with a credit card.  It seems there is a different psychology about how people pay for things vs. services, or perhaps how people purchase software.


I just pulled some payment history, and see that almost 30% pay by check, 7% by ACH, and the rest with a credit card.  Breaking things down a bit further, of customers who purchased the software directly, only about 7% paid with a credit card.  Partners were the exact opposite--only 7% paid with a check--the rest used a credit card.  So that's interesting--GP partners are the primary drivers of the credit card purchases.  But based on my experience, they never pay for consulting services with a credit card.


As a result of these behaviors or trends, I've had to adapt my processes and systems.  I now respond to all of the software related inquiries to get them out of the way first, and then I have to use the remaining time to get my consulting work done.  This is often a challenge and makes it harder to plan my consulting work since the software inquiries and support requests can vary so dramatically from day to day.

I started accepting credit cards in 2010 to accommodate all of the requests, and this year I finally added a payment page to my web site so that customers can pay online without having to fill out a form or call me to process the transaction.  Accepting payments on my web site has been a big hit--customers can submit the payment in under a minute and receive their final license key shortly after. While by no means revolutionary, it seems to be somewhat progressive for the Dynamics GP marketplace.

Anyway, just some observations that I thought were interesting.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter





Dynamics GP Server field is blank due to changes in Dex.ini file

By Steve Endow

A customer was having an issue with her "Server" field being blank when she launched Dynamics GP.


I previously wrote about a similar situation where a space at the beginning of the DSN name could cause this issue.  But in the case of this client, the DSN does not have any spaces at the beginning of the DSN name.

Yesterday I had a call with the customer and we resolved the issue by launching GP, selecting the Dynamics GP 2013 "Server" value, logging in to GP, and then closing GP.  The next time she launched GP, the Server field had a default value.  Problem solved.

But she emailed me again today, saying that the problem came back--the Server field was blank again.

I asked her to email me her Dex.ini file.  When I reviewed the file, I found this:

SQLLastDataSource=DYNAMICS GP 2013

There was a value for the last data source, but it was in all caps, which seemed odd.

I opened the Dex.ini on my GP server, changed the SQLLastDataSource value to all caps, and launched GP.  Sure enough, the Server field was blank: that was the cause of the problem.

So it seems that in addition to the GP login window not liking spaces, the Server field value that is read from the Dex.ini is case sensitive.  If the window doesn't find an exact match in the list of DSNs, including capitalization, it will leave the Server field blank.
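
In other words, for the DSN on this machine, the Dex.ini setting needs to match the DSN name exactly, including case:

SQLLastDataSource=Dynamics GP 2013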

Okay, so I can understand that.

But what is weird is that an all caps value is being saved to the SQLLastDataSource setting in the customer's Dex.ini file.  That seems really strange.

There is only one DSN setup on that particular customer machine, and only "Dynamics GP 2013" shows up in the Server drop down list.  So that would seem to indicate that another machine is using the Dex.ini file and writing the all caps DSN name to the file.

So does that mean that some other machine is actually launching GP from a network share?  Or is there some other way that another GP machine can utilize a shared Dex.ini?  I wouldn't think so, but apparently it is happening somehow.  Very odd.

I've asked the customer and their partner to try and track down the problem.

UPDATE: The client informed me that rather than determine the cause of the Dex.ini changes, they have just changed the DSN on the machine to be all caps to match the mystery value that is being written to the Dex.ini.  So I guess I'll never know the cause...


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter





Design Specifications- What, How, Yet, How Much?

I know we have all been there.  You (or your team member) create a fabulous customization, report, or interface.  Or, at least you think it’s fabulous.  But once it’s delivered to the client, the issues begin…

  - It doesn’t work the way we thought it would
  - We need this additional work for it to meet our needs
  - We thought we already explained this

And the list goes on and on and on.  I had a webinar last week for GPUG about creating specifications for customizations, reports, and interfaces, and I thought I would share some of that content here as well.  I am continually reminded of the importance of the spec process, seemingly every week.  And if I let it slide, either because I think it’s not a big deal or because the client is resistant to committing to a spec document for whatever reason, the universe almost always course corrects me at some point in the project (e.g., budget overrun, client dissatisfaction, etc.).

So we all know the reasons to create specs, right?
  - Reduce risk
  - Ensure clear communication and expectations
  - Accurately scope work
  - Create a better outcome in terms of design and client response

But what are the key components of a successful spec?  Now, I completely understand that this list will vary based on the client, the developer, and the specifics of the customization/report/interface being addressed.  This list simply serves as a starting point checklist of the normal “must-haves”:
  - WHAT is the spec for?  A functional/plain language description of the problem to be solved, with visual flowcharts/mockups as necessary to make the point.  Include any caveats/assumptions/risks to be clear up front.
  - HOW do you plan to address the need?  The technologies/tools/methods to be used, including prerequisites for deployment.  At a minimum, a high-level technical design, but it could be more detailed, including fields, calculations, and needed logic.  Whether it is high-level or detailed depends on what you need to be able to accurately scope it out, and how involved the project may be.  If it is a large project, the initial spec may have a high-level technical design with a statement that a full technical design is included in the estimate and may change.  On smaller projects, a full technical design may not take much time and will allow you to better scope/estimate the project.
  - YET, what might be the hurdles?  Always note any exceptions, assumptions, or outstanding questions that might impact the estimate and/or complexity of the project.  It’s okay to have outstanding items, as long as they are documented and ultimately addressed in the project.
  - HOW MUCH will this cost?  And don’t just focus on development costs; make sure you include project management, design, and testing (unit, function, process, and supporting end user acceptance testing) in your estimate.

I love getting all of this down on paper, I just do.  And it is great when a client fully commits to the process, and we can make sure we are all on the same page before work starts.  But, as with anything in this line of work, you can’t make the perfect the enemy of the adequate at times.  So although a spec is important, don’t let it become the bat with which you bully others (either your team members or the client).  Try to remember the intention of the spec, and approach it with good faith.   Things may change as the project proceeds, and the client may grow in their understanding of their own needs.  Of course, assess the impact of these changes on your budget and timeline, but also know that a minor change with no measurable impact can be addressed without a lot of hub-bub (assuming that the necessary decision makers are aware).

Naturally, everything above is just my opinion.  I would love to hear from you all regarding your “must-haves” for successful spec documents, and your challenges (and successes) with applying these guidelines to projects.


Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.


eConnect error: hexadecimal value 0x00, is an invalid character

By Steve Endow

A client emailed me regarding the following error they were receiving with a Dynamics GP 2010 eConnect SOP Invoice integration.

eConnectException: '.', hexadecimal value 0x00, is an invalid character. Line 5, position -248.

Based on the error message, it appeared that there was some type of invalid character in the CSV source data file.  But when we looked at the CSV file in a text editor, everything looked fine.

So I ran the integration in debug mode in Visual Studio, and when the eConnect insert failed, I checked the XML data that was submitted.  The results were interesting.

Trans Number:         205112482
Borrower:             THOMAS JEFFERSON
                      VIRGINIA COLONIES
Title Company:        &#x0;
Commitment Number:    &#x0;
File Name:            &#x0;
Lien Type:            FIRST
Loan Type:            CONVENTIONAL


The text above is a long comment string that is being inserted with the SOP invoice header.  It seems that there is an odd hex character in three of the field values.

I had to look up that hex character, and learned that it is the hex value of the null character.  Apparently eConnect is not a fan of the null character, so that is causing the error.


When I open the file in UltraEdit and view it in text mode, everything looks fine.

But when I view the file in UltraEdit's hex mode, there are "00" byte values embedded in the data, displayed as periods in the text representation.  That's our problem.

So how do we deal with these rogue null characters?  Ideally, the source data file would be corrected so that they are not inserted in the file in the first place.  But in case they do show up in the data file again, it's fairly easy to remove them using Regular Expressions.

So I added this simple function to strip the null character from the field value.

//Requires: using System.Text.RegularExpressions;
public static string RemoveNull(string input, string replaceWith)
{
    //Replace any null (0x00) characters in the input with the specified string
    string output = Regex.Replace(input, "\x00", replaceWith);
    return output;
}

Once I applied the RemoveNull function to the invoice comment text, the invoices imported fine and the error went away.

I've never encountered the "null" character before, but fortunately it was a pretty easy fix.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter








Turn off all Dynamics GP posting reports in the Fabrikam test company

By Steve Endow

It's pretty sad when you know how many times you have to click the cancel button on the report dialog boxes after you post a certain Dynamics GP batch.

My laziness about constantly clicking a Cancel button finally overtook my laziness about disabling all posting reports, and I spent the 10 seconds to create the script to turn off all posting reports in Fabrikam.

UPDATE TWO..SY02200 SET PRNTJRNL = 0


There.  Done.

I now need to make it a standard part of the GP installation process after I create the Fabrikam company.

Batches and posting reports...two anachronisms of the ERP world that GP will probably never abandon...

Ask a NetSuite consultant about their "batches" and "posting reports" if you want to see some puzzled reactions.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter



Microsoft Dynamics GP 2016 should go meatless

By Steve Endow

Roving Dynamics GP reporter Tim Wappat travelled to France today to get a sneak peek at a very early version of Dynamics GP 2016.  He posted an article on his blog:

http://www.timwappat.info/post/2015/09/15/GP2016-looking-sweet-in-HTML5

One of the interesting side notes that he mentions is that the "hamburger icon", or "hamburger menu" was being used, for the time being, in the pre-release GP 2016 web client.


The demo that Tim saw was obviously pre-pre-pre-Release, and everything is subject to change, but if Microsoft is even considering using the hamburger menu, I beseech them to reconsider.

I admit that I didn't know the name of that three-bar icon until a few weeks ago--which is when I read this very convincing article on why the hamburger menu is a horrible UI element.

http://deep.design/the-hamburger-menu/

The article makes a strong case that the hamburger menu is detrimental to application navigation, and then goes on to cite numerous examples and statistics to support its assertion.

Here is another article with the same conclusion and more stats:

https://redbooth.com/blog/hamburger-menu-iphone-app

After reading about the design deficiencies, I now notice how annoying the hamburger menu is on my iPhone apps--I am looking for settings and options and features, but can't find them--until I realize there is an innocuous little three line icon in the corner that I have to click to expand a menu.  It's a great way to hide features so that your users never use them.

While there may be some situations where limited use of the hamburger menu makes sense--rarely used settings or infrequently used windows--it should be used very sparingly.  And based on the articles above, for primary navigation it should be replaced with some other menu design.

Save the cows.  Get rid of the hamburger.


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter




Project In One Company, Not All Companies

So, back in the day, Project Accounting had its own purchase order window.  And we all moaned and complained about this.  And we rejoiced when it was combined into the standard purchase order window.  But little did we appreciate the complexities that this would bring.

These complexities are illuminated when you try to use Project Accounting in one company but not all companies in your installation.  This post focuses on the issues you will encounter in the non-Project Accounting companies.  Once Project Accounting is registered, you will begin receiving messages when accessing the Purchase Order Entry window, like 'Purchase Order Processing Setup Information is Missing or Damaged'.  Ugh.  And then, once you resolve that, you will be greeted with 'Project Accounting information for this vendor does not exist. Do you want to add the vendor's project information?' for every vendor you select when entering a purchase order.  Double ugh.  And keep in mind, these errors are encountered in companies where users DO NOT have access to the alternate project accounting windows.

On a side note, I am not exploring the Dynamics.Set file hack (having a separate non-Project Set file) in this post, as I generally am not a fan of separate Set files due to the potential for confusion and issues.  But to each their own :)

So, what can you do...

For the first message related to Purchase Order Processing Setup, you will need to (at least temporarily) grant access to the alternate Purchase Order Processing Setup window for Project Accounting (Setup-System-Alternate/Modified Forms and Reports ID).  Once you have access, go to Setup-Purchasing-Purchase Order Processing.  Click the Project button, click OK, and close the Purchase Order Processing Setup window.  At this point, even if you remove access to the alternate setup window, the first error message regarding setup information will be resolved.

But the vendor message will persist.  And yes, as I mention above, you can have users simply say NO and continue on with the entry.  But the pop up is indeed annoying.  So you have a few options...

First, you have to address all existing vendors at the time of registration of project accounting.  To do this, you can check out the script here that will do it automatically:

http://kbase.dyntools.com/home/project-accounting/project-accounting-information-for-this-vendor-does-not-exist

Then, you need to decide how you want to handle new vendors added to GP.  This is where the options come in to play.

Option A- Grant access to the alternate Vendor Maintenance window for Project Accounting.  This would be the ONLY alternate project accounting window that the users will need to access.  This way, when they save a new vendor, it will automatically save the project info as well (which will prevent the prompt to add project info when entering a purchase order for the vendor).

Option B- If Option A is not possible (for example, if you have another product that has an alternate Vendor Maintenance window as well that users need to access), then you can either schedule the script to create the project info to run every evening or look at some VBA code to populate the project info even when the alternate window is not used.

Please feel free to share other options/workarounds you have found to avoid these headaches!



Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Finding batches with errors across all Dynamics GP company databases

By Steve Endow

A customer contacted me with an interesting problem.  They utilize Post Master Enterprise to automatically post batches in over 150 company databases.  The automatic batch posting is working fine, but they occasionally have some batches with errors that go into batch recovery.  Post Master sends them an email message for each batch that goes into recovery, but with over 150 company databases, they wanted a way to generate a list of all problem batches across all of their company databases.

I haven't done a ton of research into batch recovery and how GP detects which batches to list in the batch recovery window, but based on a quick check of some failed batches in Fabrikam, it looks like the BCHSTTUS field in the SY00500 table was a good place to start.

This KB article lists the different values for the BCHSTTUS field.

https://support.microsoft.com/en-us/kb/852420


So now we can create a query like this:

SELECT * FROM TWO..SY00500 WHERE BCHSTTUS > 6


That's a start, but it isn't a great solution if we need to run the query in over 150 different databases.

So after some digging, I found this StackOverflow thread and used the last suggestion on the thread.


-- Temp table to collect one row of results per company database
CREATE TABLE #tempgpquery
(
[DB] VARCHAR(50), 
[Records] INT
)

-- Get the list of GP company database names from the system database
DECLARE @db_name varchar(10)
DECLARE c_db_names CURSOR FOR
SELECT INTERID FROM DYNAMICS..SY01500

OPEN c_db_names

FETCH c_db_names INTO @db_name

-- Loop through each company database and count its problem batches
WHILE @@Fetch_Status = 0
BEGIN
  EXEC('
    INSERT INTO #tempgpquery
    SELECT ''' + @db_name + ''',COUNT(*) FROM ' + @db_name + '..SY00500 WHERE BCHSTTUS > 6
  ')
  FETCH c_db_names INTO @db_name
END

CLOSE c_db_names
DEALLOCATE c_db_names

-- Output the combined results, then clean up
SELECT * FROM #tempgpquery

DROP TABLE #tempgpquery


It looks complex due to the temp table and cursor, but it's actually a fairly straightforward query.

You can modify the query to do whatever you need; in this case it just does a count of records in SY00500 where the batch status is greater than 6.  I queried the list of valid company database names from the SY01500 table, and used that list in a cursor to loop through each database and query it.

It seems to work very well.
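
And if you would rather see the individual problem batches than just a count, the per-database query can return batch details instead.  Here is a quick single-company sketch (if you use this in the cursor version above, the temp table columns would need to change to match):

-- List the individual problem batches in one company rather than counting them
SELECT BACHNUMB, SERIES, BCHSTTUS
FROM TWO..SY00500
WHERE BCHSTTUS > 6
ORDER BY BACHNUMB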

But like I said, I'm not 100% sure if the Batch Status field is the only indicator of batch recovery, so if anyone has more info on the query to properly detect batches that have gone to recovery, please let me know.


UPDATE:  The very clever Tim Wappat took up the challenge to find a simpler and cleaner way to perform the query.  He uses the novel approach of building a UNION ALL statement, generated by querying sys.databases.  The results are the same, but his query avoids both the temp table and the cursor.  It is rather compact, which makes it a little more difficult to decipher, but it is pretty elegant.  For his superior submission, Tim wins 100 Internet Points.


DECLARE @sql NVARCHAR(MAX);

SET @sql = N'DECLARE @cmd NVARCHAR(MAX); SET @cmd = N'''';';

SELECT @sql = @sql + N'SELECT @cmd = @cmd + N''UNION ALL
SELECT ''''' + QUOTENAME(name) + ''''', COUNT(*) FROM ' 
  + QUOTENAME(name) + '.dbo.SY00500 WHERE BCHSTTUS > 6 ''
WHERE  EXISTS (SELECT 1 FROM ' + QUOTENAME(name) 
 + '.sys.tables AS t
 INNER JOIN ' + QUOTENAME(name) + '.sys.schemas AS s
 ON t.[schema_id] = s.[schema_id]
 WHERE t.name  = N''SY00500''
 AND s.name  = N''dbo'');'
FROM sys.databases WHERE database_id > 4 AND state = 0;

SET @sql = @sql + N';
SET @cmd = STUFF(@cmd, 1, 10, '''');
PRINT @cmd;
EXEC sp_executesql @cmd;';

PRINT @sql;
EXEC sp_executesql @sql;



Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter



Oddities- Missing Project Accounting Accounts

So we all know (or are constantly reminded) that the simplest answer is usually correct.  So when a client recently called to report that their project accounting accounts were missing (PA43001), we immediately started brainstorming "simple" explanations...

1. Maybe they aren't really "gone" (no, they really are gone)
2. Maybe somebody removed them (but how, I mean it's just the COGS accounts and they are simply gone)
3. Maybe someone working on another support case inadvertently removed them (not likely, as they were there at 2pm and gone at 4pm, and no cases were being worked in that time)

For the record, the client is on GP 2013 R2/YE 2014 (12.00.1801).

My coworkers sometimes give me a hard time because the last thing I tend to consider is an actual bug in the software.  The reason I avoid this explanation is that it is too easy, and often is not the case.  And particularly when data just disappears with no other process/issue/explanation, it doesn't seem likely that the software simply decided to dump the data without provocation.  So I like to make sure we exhaust other routes.  We went about fixing the issue in this case, restoring the accounts, but it was still bothersome that we could offer no explanation as to why it happened (again, racking my brain for a simple answer).

So we started a case, and found out that this is indeed a quality report (#9120 to be exact).  An apparent bug in GP that several clients have reported, but Microsoft has been unable to replicate.  Odd.  Very odd.  The good news is two-fold: first, of the clients who have reported it, no one has had a second instance of it.  And second, Microsoft GP support has a script you can run to create a shadow table that tracks the project posting accounts table, so you can monitor whether it recurs.
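
I haven't seen Microsoft's script, but the general shadow table approach would look something like this sketch: a log table plus a delete trigger on PA43001 that captures the deleted rows as XML, so you don't have to mirror every column (the table and trigger names here are my own inventions):

-- Shadow table to record any rows deleted from PA43001
CREATE TABLE dbo.PA43001_DeleteLog
(
LogDate DATETIME NOT NULL DEFAULT (GETDATE()),
HostName VARCHAR(128) NULL DEFAULT (HOST_NAME()),
LoginName VARCHAR(128) NULL DEFAULT (SUSER_SNAME()),
DeletedRows XML NOT NULL  -- full image of the deleted rows
)
GO

CREATE TRIGGER dbo.trPA43001_LogDeletes
ON dbo.PA43001
AFTER DELETE
AS
BEGIN
  SET NOCOUNT ON;
  IF NOT EXISTS (SELECT 1 FROM deleted) RETURN;

  -- Record when, who, and exactly what was deleted
  INSERT INTO dbo.PA43001_DeleteLog (DeletedRows)
  SELECT (SELECT * FROM deleted FOR XML RAW('PA43001'), TYPE);
END
GO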

What's the lesson in this?  The simplest explanation may just be that it is indeed a software bug.  I guess that's it?

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

GP 2015 VBA UserInfoGet "Permission Denied" error resolved in hotfix and 2015 R2

By Steve Endow

I don't do much Modifier & VBA anymore (mostly VS Tools for customizations), and have only had a handful of customers upgrade to GP 2015, which is probably why I haven't run into this error until today.

While testing an upgraded VBA customization on GP 2015 today, I received this error.

Run-time error '70':  Permission Denied

When I clicked on Debug, I was taken to the call to UserInfoGet.CreateADOConnection.


Thankfully there is a Community Forum post on this error, and the discussion indicates that this was an issue with GP 2015 RTM.  A post on that thread states that GP 2015 version 14.00.0661 resolves the issue, and I see that I even posted to the thread confirming that the problem was resolved with 661.

But today the problem occurred for me while running 0661, and I just confirmed that my client is running 0661 and is receiving the error.
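
As a quick aside: one easy way to confirm the exact build an environment is running is to check the DB_Upgrade table (DU000020) in the DYNAMICS database, which holds the database version and should reflect the applied update.  A quick sketch, where PRODID 0 is Microsoft Dynamics GP:

-- Check the GP version/build recorded in the system database
SELECT DISTINCT PRODID, db_verMajor, db_verMinor, db_verBuild
FROM DYNAMICS..DU000020
WHERE PRODID = 0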

After upgrading to the October 2015 Hotfix (KB 3089874, 14.00.0817, post R2 update), the error went away and my VBA code and UserInfoGet calls work fine.

So it seems that somehow the problem can still occur with version 14.00.0661, but is resolved in a newer hotfix.


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter



Dynamics GP "new code" message appears every time you launch GP

By Steve Endow

This week I upgraded and messed with my GP 2015 installation as part of a support case.  At some point, I started to get the "New code must be added" dialog box every time I launched GP.


I clicked Yes and then logged into GP, but the dialog appeared again the next time I launched GP.  I tried Run as Administrator, clicked Yes, and logged in.  But it still kept appearing.

I finally stopped and thought for a second.  I searched my GP 2015 directory for *.cnk files.  And there I found tautil.cnk.  I don't know how it got there or why it was there.  I already had the TAUTIL.DIC dictionary in the directory, and had it listed in the Dynamics.set file.

My only guess is that perhaps the dictionary was updated as part of the hotfix I installed, and for some reason it failed to unchunk.

I deleted the tautil.cnk file and the New code message went away.

Problem solved, mystery remains.

UPDATE: A reader makes a good point, which I also pondered later, that the better approach is probably to rename the existing dictionary and let GP unchunk the .cnk file, in case the dictionary was not successfully updated.  In my case, since it was only PSTL / TAUTIL, and it was on my development machine, it wasn't a concern, but in a real environment, renaming the existing dictionary is the more prudent approach.


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter



