MSDyn365FO. Electronic Reporting. Parse a text file with different record types.

In the previous blog post series we learned how to import a simple CSV file. However, CSV and other text files may contain records of different types that should be imported into different tables or processed differently. In this post we will enhance the format and mapping created earlier. Let's say that in our example the first column represents the record type (a, b or c):

CSVImportTestFile
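
The file keeps the comma-separated layout from the earlier parts, but the first (String) column now carries the record type. Illustrative content (values are made up):

a,1.5,10
b,2.25,20
c,3.5,30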

In this case we need to change the format and add a new "CASE" element:

FormatCase

And 3 record sequences instead of one:

FormatCaseRecords

Note that each sequence has a predefined value for the String field; that's how we tell ER that if a record has "a" in the first column it should be parsed with the RecordA sequence. Also, we changed Multiplicity to "One many" for the sequences, to tell ER that there is at least one record of each type. It could be set to "Zero many" if a record type is optional, or "Exactly one" if we expect it only once in a file.
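
Putting it together, the format tree now looks roughly like this (the Types, RecordA/RecordB/RecordC and Data names match the expression shown below; yours may differ):

File
  Sequence (delimiter: New line – Windows (CR LF))
    Types (Case)
      RecordA (Sequence, One many) - String field predefined to "a"
        Data (sequence with the record fields)
      RecordB (Sequence, One many) - String field predefined to "b"
        Data (sequence with the record fields)
      RecordC (Sequence, One many) - String field predefined to "c"
        Data (sequence with the record fields)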

Now we need to change the mapping. Each sequence has a system field, "isMatched", that is populated if a file record matches the sequence pattern. We will use it to bind the 3 record types to the same model field, but in real-life examples different record types may go to different tables, like header and lines.

MapModelToFormatCase

The expression used is pretty simple: it takes the value from the first record type if it is matched; if not, it checks the second, and if that one is not matched either, it takes the value from the third.

IF(@.Types.RecordA.IsMatched, @.Types.RecordA.Data.FieldInt, IF(@.Types.RecordB.IsMatched, @.Types.RecordB.Data.FieldInt, @.Types.RecordC.Data.FieldInt))

You can also use IsMatched to provide a default value when a field is missing from the file.
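
For example, a hypothetical expression along these lines would fall back to 0 when the optional RecordC is not present in the file:

IF(@.Types.RecordC.IsMatched, @.Types.RecordC.Data.FieldInt, 0)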


MSDyn365FO. How to Import CSV file using Electronic Reporting. Part 4 – Map Model to Datasource.

That's the last step. We have a Format that parses data into a model; now we need to write data from the model back to the actual tables. Open the model designer and click "Map model to datasource" in the action pane.

Create a new record, set the name and description. Select the direction; for import it should be "To destination":

MapDatasourceMapping

Open the designer. In the Data Sources area, add the data model:

MapDatasourceMappingAddModel

Set the name, select the model and definition:

MapDatasourceMappingAddModelProperties

Add a new destination:

MapDatasourceMappingAddDestination

Select the table you want to insert into, set the name, specify whether it should support insert only or update/insert, and whether validation should be triggered:

MapDatasourceMappingAddDestinationSelectTable

Bind the Record List from the model to the table's record list and bind the fields accordingly:

MapDatasourceMappingBind

Save and we are ready to test it!

Go back to the Model to Datasource mapping screen, select the mapping record and click Run in the action pane:

MapDatasourceMappingRun

Select the file and check the result. I do not have a UI for this table, so I'm just going to use the table browser:

ImportCSV_TableBrowser

It's not that convenient to trigger the import from the mapping record, but that's the only way I know of. In the next post we will look at how to trigger it from X++ code, so that we can have a button somewhere.

As you can see, ER is a really powerful tool that allows users to do tasks that previously were not possible without development.

MSDyn365FO. How to Import CSV file using Electronic Reporting. Part 3 – Map format to model.

In this blog post we will create a new mapping to map the format to the model. On the format designer screen, click the "Map format to model" button. Create a new record, select the model definition, and specify a name and description:

MapModelToFormat

Open the designer, bind the Record List from the format to the Record List in the model, and then bind the fields accordingly.

MapModelToFormatBinding

Finally, we can test our format. For the test I'm going to use a simple CSV file:

CSVImportTestFile
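
The file itself is nothing special: one comma-separated line per record, with values for the String, Real and Int fields. Something like this (values are made up):

Test1,1.5,10
Test2,2.25,20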

Go back to the format to model mapping form, click Run in the action pane and upload the file.

MapModelToFormatTest

If we've done everything right, we will get an XML file that contains the data mapped to the model, or errors, if any:

MapModelToFormatTestResult
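
Assuming the model from Part 1 with a root node, a record list and three fields, the output looks roughly like this (all element names here are hypothetical and will follow your model's node names):

<Model>
  <Lines>
    <FieldString>Test1</FieldString>
    <FieldReal>1.5</FieldReal>
    <FieldInt>10</FieldInt>
  </Lines>
  <Lines>
    <FieldString>Test2</FieldString>
    <FieldReal>2.25</FieldReal>
    <FieldInt>20</FieldInt>
  </Lines>
</Model>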

At this stage we have a Data Model, a Format and a Mapping that we've tested. In the next blog post we will do the last piece of the setup: map the Model to the Destination and test the whole import.


MSDyn365FO. How to Import CSV file using Electronic Reporting. Part 2 – Format.

In this blog post we will create a new Format. It represents the document schema and is used to parse the document. Go to Organization administration > Workspaces > Electronic reporting, select the Data Model created in the previous post, and create a new configuration:

CSVImportFormat

In the format designer, click Add root –> File:

FormatRoot

Add a sequence and set the delimiter to New line – Windows (CR LF). This tells ER that the file has lines separated by CR LF. It is possible to select CR for Mac or LF for Linux, or to specify a custom delimiter.

FormatRootSequence

Add a new sequence. This sequence will represent the repeating lines. Set Multiplicity to "One many" to say that at least one line is required.

FormatRootLine

Add another sequence. It will represent an individual record. Set the delimiter to ',' to split the fields by comma. Use another delimiter if required.

FormatRootRecord

Add 3 fields:

FormatFeilds

In the end you should have a format like this:

Format.png
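
In text form, the hierarchy we have built is roughly this (the node names are mine; yours will match whatever you typed in the designer):

File
  Sequence (delimiter: New line – Windows (CR LF))
    Lines (Sequence, multiplicity One many)
      Record (Sequence, delimiter ',')
        FieldString (String)
        FieldReal (Real)
        FieldInt (Integer)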

In the next post we will map format to model and test it!


MSDyn365FO. How to Import CSV file using Electronic Reporting. Part 1 – Data Model.

In this post series I will show how to use Electronic Reporting (ER) to import a CSV file. This tool allows us to create a new import process without a single line of X++ code. It can be maintained by end users without a developer's help, and it does not require deployments, because configurations can easily be transferred between environments via XML export and import.

I used the available documentation, but it does not have enough detail, so here I will try to explain the process step by step. We will start with Data Model creation and go through all the stages below:

er-overview-import.png

For simplicity, I’m going to use a custom table that has 3 fields: String, Real and Int.

TestImportTable.png

A Data Model is an abstraction over the destination/source tables and can be used by multiple different formats. In our case it will be similar to the destination table, because the table is quite simple. To create one, go to Organization administration > Workspaces > Electronic reporting.

CSVImportModel

In the designer, create the model root node:

ModelRoot

Add a Record List:

ModelLines.png

Now add 3 fields, one for each column in the source file:

ModelFIelds

In the end you should get this structure:

DataModel
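
In text form (the node names below are placeholders I use throughout the series):

Model (root)
  Lines (Record List)
    FieldString (String)
    FieldReal (Real)
    FieldInt (Integer)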

Change the model status from Draft to Completed; this is required for the next step.

In the next blog post we will create a new Format.

MSDyn365FO. Add postTargetProcess method to a Data Entity via extension.

Quite a while ago I wrote a blog post about the postTargetProcess method that can be added to a data entity for post-processing. Previously, you could add it only to a newly created entity, but now you can extend any entity using Chain of Command (CoC).

This method is a bit special because it is not declared on the parent object, but is called via reflection by the DMFEntityBase class. The full list of methods that work in the same way:

  • defaultCTQuery
  • copyCustomStagingToTarget
  • postGetStagingData
  • preTargetProcessSetBased
  • postTargetProcess

DMFEntityBase uses tableHasStaticMethod() to check whether the method is defined and DictDataEntity.callStatic() to call it. In the latest PUs Microsoft added support for CoC methods to the Dict* APIs and reflection methods, so it is now possible to use a CoC method in this scenario as well.
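
Conceptually, the framework does something along these lines (a simplified sketch of the idea, not the actual SYS code):

DMFDefinitionGroupExecution dmfDefinitionGroupExecution; // the current execution record

SysDictTable   dictTable  = new SysDictTable(tableNum(MyDataEntity));
DictDataEntity dictEntity = new DictDataEntity(tableNum(MyDataEntity));

// check for the "magic" static method by name and invoke it if present
if (tableHasStaticMethod(dictTable, identifierStr(postTargetProcess)))
{
    dictEntity.callStatic(identifierStr(postTargetProcess), dmfDefinitionGroupExecution);
}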

The extension is quite simple and straightforward: all you need to do is declare a static method with the respective signature. Do not call next, because this method is not defined on the parent object:

[ExtensionOf(tableStr(MyDataEntity))]
public final class MyDataEntity_Extension
{
    public static void postTargetProcess(DMFDefinitionGroupExecution _dmfDefinitionGroupExecution)
    {
        // post-import processing logic goes here; do not call next()
    }
}
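
For illustration, a hypothetical implementation could post-process the records imported in the current run by querying the entity's staging table (I'm assuming a staging table named MyDataEntityStaging here; DefinitionGroup, ExecutionId and TransferStatus are the standard DMF staging fields):

public static void postTargetProcess(DMFDefinitionGroupExecution _dmfDefinitionGroupExecution)
{
    MyDataEntityStaging staging;

    // iterate the staging rows of this execution that were transferred successfully
    while select staging
        where staging.DefinitionGroup == _dmfDefinitionGroupExecution.DefinitionGroup
           && staging.ExecutionId    == _dmfDefinitionGroupExecution.ExecutionId
           && staging.TransferStatus == DMFTransferStatus::Completed
    {
        // e.g. update related records, send notifications, etc.
    }
}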

MSDyn365FO. AX 2012 data upgrade with virtual companies.

In Dynamics 365 for Finance and Operations virtual companies are deprecated and cannot be upgraded, according to the documentation, which simply states this fact without proposing a solution.

If we cannot upgrade them, we need to get rid of them. Here are the high-level steps for de-virtualizing a table. Let's say we have a virtual company V with two companies, A and B. For simplicity, V has only one table collection with one table; let's call it Table1.

  1. Delete the data from Table1 that belongs to companies A and B. When you create a virtual company, you may already have some data in the tables you want to share. That data may be orphaned, so we want to delete it to avoid duplicates. This can be done via a simple T-SQL script:
    DELETE FROM Table1
    WHERE DATAAREAID = 'A' OR DATAAREAID = 'B'
    
  2. Go to System administration > Setup > Virtual company accounts and delete the virtual company.
  3. Restart the AX client.
  4. Insert the data from the virtual company into the de-virtualized companies via an X++ job:
    static void deVirtualizeTables(Args _args)
    {
        DataAreaId          virtualDataAreaId = 'V';
        container           oldCompaniesCon   = ['A', 'B'];
    
        VirtualDataAreaList virtualDataAreaList;
    
        void deVirtualizeTable(TableId _tableId)
        {
            int             i;
            DataAreaId      dataAreaId;
            SysDictTable    dictTable = new SysDictTable(_tableId);
            Common          buffer    = dictTable.makeRecord();
            Common          newBuffer;
    
            // find the records that still belong to the deleted virtual company
            while select crossCompany buffer
                where buffer.dataAreaId == virtualDataAreaId
            {
                // copy each record into every former member company
                for (i = 1; i <= conLen(oldCompaniesCon); i++)
                {
                    dataAreaId = conPeek(oldCompaniesCon, i);
                    changeCompany(dataAreaId)
                    {
                        newBuffer = null;
                        newBuffer = dictTable.makeRecord();
    
                        buf2buf(buffer, newBuffer);
                        newBuffer.doInsert();
                    }
                }
            }
        }
    
        // the virtual company itself must be deleted first (step 2),
        // otherwise we would create duplicates of still-shared data
        select firstOnly RecId from virtualDataAreaList;
    
        if (virtualDataAreaList)
        {
            throw error('Delete a virtual company first!');
        }
    
        ttsBegin;
    
        deVirtualizeTable(tableNum(Table1));
    
        ttsCommit;
    }
    
  5. Delete the data from the virtual company. It cannot be done from X++, because you cannot use changeCompany with a company that does not exist. T-SQL:
    DELETE FROM Table1
    WHERE DATAAREAID = 'V'
    

Now we are ready to run the standard data upgrade procedure!
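
As a final sanity check before kicking off the upgrade, you can verify that no rows are left in the virtual company and that the copies landed in A and B (illustrative T-SQL):

SELECT DATAAREAID, COUNT(*) AS ROWCNT
FROM Table1
GROUP BY DATAAREAID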