MSDyn365FO. How-to print a report to a byte array.

There are blogs available on this topic, but they use lots of internal-use-only objects, like SRSProxy. It’s a well-known problem and hopefully it will be fixed one day; meanwhile, here is another way to achieve the same result without any compile warnings.

First, let’s extend SrsReportDataContract and add two variables: a do-not-save-file flag and a byte array.

[ExtensionOf(classStr(SrsReportDataContract))]
final public class SrsReportDataContract_IM_Extension
{
    private System.Byte[] reportBytes_IM;
    private boolean doNotSaveFile_IM;

    public System.Byte[] parmReportBytes_IM(System.Byte[] _reportBytes_IM = reportBytes_IM)
    {
        reportBytes_IM = _reportBytes_IM;
        return reportBytes_IM;
    }

    public boolean parmDoNotSaveFile_IM(boolean _doNotSaveFile_IM = doNotSaveFile_IM)
    {
        doNotSaveFile_IM = _doNotSaveFile_IM;
        return doNotSaveFile_IM;
    }
}

Now, let’s create a helper class that does all the magic required. The solution is simple: subscribe to the toSendFile delegate, and if the contract has the new flag set, save the report bytes back to the contract and set the result to false so the standard code won’t save the file. The class has another helper method that accepts a report controller and returns a byte array.

public static class SrsReportRunHelper_IM
{
    [SubscribesTo(classStr(SRSPrintDestinationSettingsDelegates), delegateStr(SRSPrintDestinationSettingsDelegates, toSendFile))]
    public static void SRSPrintDestinationSettingsDelegates_toSendFile(System.Byte[] _reportBytes, SrsReportRunPrinter _printer, SrsReportDataContract _dataContract, Microsoft.Dynamics.AX.Framework.Reporting.Shared.ReportingService.ParameterValue[] _paramArray, EventHandlerResult _result)
    {
        if (_dataContract.parmDoNotSaveFile_IM())
        {
            _dataContract.parmReportBytes_IM(_reportBytes);
            _result.result(false);
        }
    }

    public static System.Byte[] printReportToByteArray(SrsReportRunController _controller)
    {            
        SrsReportDataContract reportContract = _controller.parmReportContract();
        SRSPrintDestinationSettings printerSettings = reportContract.parmPrintSettings();
        printerSettings.printMediumType(SRSPrintMediumType::File);
        printerSettings.fileFormat(SRSReportFileFormat::PDF);

        reportContract.parmDoNotSaveFile_IM(true);

        _controller.parmShowDialog(false);
        _controller.startOperation();
                
        return reportContract.parmReportBytes_IM();
    }
}

Usage example

SalesInvoiceController controller = SalesInvoiceController::construct();
controller.parmReportName(PrintMgmtDocType::construct(PrintMgmtDocumentType::SalesOrderInvoice).getDefaultReportFormat());

SrsReportDataContract reportContract = controller.parmReportContract();
SRSPrintDestinationSettings printerSettings = reportContract.parmPrintSettings();
printerSettings.printMediumType(SRSPrintMediumType::File);
printerSettings.fileFormat(SRSReportFileFormat::PDF);

SalesInvoiceJournalPrint salesInvoiceJournalPrint = SalesInvoiceJournalPrint::construct();
salesInvoiceJournalPrint.parmPrintFormletter(NoYes::Yes);
salesInvoiceJournalPrint.parmUsePrintManagement(false);
salesInvoiceJournalPrint.parmUseUserDefinedDestinations(true);
salesInvoiceJournalPrint.parmPrinterSettingsFormLetter(reportContract.parmPrintSettings().pack());

Args args = new Args();
args.caller(salesInvoiceJournalPrint);
args.parmEnumType(enumNum(PrintCopyOriginal));
args.parmEnum(PrintCopyOriginal::OriginalPrint);
args.record(custInvoiceJour);

controller.parmArgs(args);

System.Byte[] reportBytes = SrsReportRunHelper_IM::printReportToByteArray(controller);

And then you can convert it to a base64 string:

str ret;

if (reportBytes)
{
    // Convert accepts the byte array directly; no MemoryStream needed
    ret = System.Convert::ToBase64String(reportBytes);
}

MSDyn365FO. How-to set brand id for a report.

Document branding is a really cool feature. Unfortunately, it is not installed by default and only a handful of reports support it, so it is not widely used. However, besides the SSRS report designs, there is a setup available under Organization administration -> Setup -> Document branding that allows you to define your brand and set logos, background colors, address, and contact information, making your reports more configurable from the UI. There is another gotcha here: by default, your brand is the company id. But in some cases you may have different businesses within one company and want a brand per business unit, for example.

It’s quite an easy change; all we need to do is set SysDocuBrandContract.parmBrandId(). Below is an example for the Sales Invoice report:

[ExtensionOf(classStr(SalesInvoiceController))]
public final class SalesInvoiceController_IM_Extension
{
    protected void preRunModifyContract()
    {
        next preRunModifyContract();

        SrsReportDataContract reportDataContract = this.parmReportContract();

        if (!reportDataContract.parmDocuBrandContract())
        {
            reportDataContract.parmDocuBrandContract(new SysDocuBrandContract());
        }

        CustInvoiceJour custInvoiceJour = this.parmArgs().record();
        var brandId = DimensionAttributeValueSetStorage::find(custInvoiceJour.DefaultDimension)
            .getDisplayValueByDimensionAttribute(DimensionHelper_IM::getBusinessUnitDimensionAttribute());

        reportDataContract.parmDocuBrandContract().parmBrandId(brandId);
    }
}

As you can see, instead of getting the dimension value by name or by backing table id, I have a helper class that retrieves the dimension attribute. Usually it is stored in a parameters table, which gives you the flexibility to change the dimension name at any time and does not cause issues if you have two dimensions with the same backing table, like “From BU” and “To BU”.

Another thing to consider is the creation of SysDocuBrandContract. In my example I populate only brandId; however, you can set branding details per report and per design, so those should be populated as well. The framework does this by default for all preprocessed reports; for query-based reports, however, you need to set parmReportName and parmDesignName manually.

You may say that no one builds SSRS reports these days, and that’s a reasonable remark. ER is great, but even there you have the SysDocuBrandDetails table available, so it can be used to avoid hardcoded colors and logos!
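For a query-based report, a minimal sketch of populating those names could look like this. This assumes the contract methods named above; the brand id, report name, and design name are illustrative values, and reportDataContract is the contract obtained from the controller as in the extension example:

```
// Sketch only: populate branding details for a query-based report.
SysDocuBrandContract brandContract = new SysDocuBrandContract();
brandContract.parmBrandId('MyBrand');               // brand defined under Document branding
brandContract.parmReportName('MyQueryBasedReport'); // hypothetical report name
brandContract.parmDesignName('Report');             // hypothetical design name
reportDataContract.parmDocuBrandContract(brandContract);
```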

MSDyn365FO. Add Licenses to Deployable Package during release

Recently, one of the environments I’m working with was updated to self-service, and I cannot install ISV licenses via a deployable package anymore. Luckily, there is an “Add Licenses to Deployable Package” DevOps task available.

You can use it either in a build or a release pipeline. I prefer not to touch the build and leave it standard, so below I will show one of the ways to use it in your release pipeline.

First, add the license files to version control. I’m dealing with multiple ISVs, so I created a “Licenses” folder under Trunk\Main and added all the licenses there.

Licenses

Now we need to make those files available to the release pipeline. Under Artifacts you can add another one: select “TFVC”, then set the Project and Source.

AddAnArtifact

The next step is to add a new task:

Add Licenses to Deployable Package

Here, in the search pattern, I’m picking all the txt files from the “Licenses” folder, so when I get a new license all I need to do is check it in and rerun the release, without even running a full build.

MSDyn365FO. OData integration, how-to set financial dimensions.

As we know, we can add financial dimensions as separate columns in Excel templates, but what about OData integration?

To set dimensions you must assign a value to the DefaultDimensionDisplayValue or AccountDisplayValue field. They can have different names, but essentially we are talking about the DisplayValue field on the DimensionCombinationEntity and DimensionSetEntity entities. This value represents all dimensions separated by a delimiter, which you configure on the “Financial dimension configuration for integrating applications” form.

For example, we have 3 dimensions: CostCenter, Department, BusinessUnit. The delimiter is set to “-”. The values are “A”, “B”, “C”, so we need to use the “A-B-C” string. However, we do not want to hard-code the delimiter or the order of the dimensions. We want to make sure that we use only the dimensions that are configured!

There are 2 OData actions available: getSetDisplayValue on DimensionSetEntity entity and getCombinationDisplayValue on DimensionCombinationEntity entity.

getSetDisplayValue accepts two arrays, one of attribute names and one of attribute values. It skips dimensions that are not configured and returns a delimited string.

POST /data/DimensionSets/Microsoft.Dynamics.DataEntities.getSetDisplayValue

Body
{
    "_attributeNames" : ["CostCenter", "Department", "BusinessUnit"],
    "_attributeValues" : ["A", "B", "C"]
}

Successful Response:

HTTP/1.1 200 OK
{
    "@odata.context" : "https:///data/$metadata#Edm.String",
    "value" : "A-B-C"
}

getCombinationDisplayValue accepts an entity name, a dimension field name, the same two arrays (attribute names and attribute values), and an account type.

POST /data/DimensionCombinations/Microsoft.Dynamics.DataEntities.getCombinationDisplayValue

Body
{
    "_entityName" : "CustomerPaymentJournalLine",
    "_propertyName" : "OffsetAccount",
    "_attributeNames" : ["MainAccount", "CostCenter"],
    "_attributeValues" : ["1000", "A"],
    "_ledgerJournalACType" : "Ledger"
}

Successful Response:

HTTP/1.1 200 OK
{
    "@odata.context" : "https:///data/$metadata#Edm.String",
    "value" : "1000-A"
}

Both actions use DimensionResolver::getEntityDisplayValue() under the hood, which replaced AxdDimensionUtil in the current version.

MSDyn365FO. Cannot sync entity that uses TableId field in the relations.

Some tables use RefRecId and RefTableId fields to build relations with other tables, so building data entities for those tables can be a not-so-straightforward exercise. Let’s take WHSFilterTable as an example. We want to build a new entity to show customer filter codes. There is already a standard entity, but we will build a new one to illustrate the issue. Our entity will join WHSFilterTable and CustTable. The relations between the tables:

WHSFilterTable.jpg

The entity’s query will be similar to this:

CustomerWHSFiltersEntity.jpg

It builds perfectly fine, but during DB sync we get a huge list of errors. I won’t list all of them here, but the main one is “System.Data.SqlClient.SqlException (0x80131904): Invalid column name ‘TABLEID’.”, which happens during trigger creation. The sync engine generates a SQL statement that has a TableId field in it, but TableId is not a real field and does not exist in the underlying tables.

So what do we do? We need that entity, and since it’s quite common to use RefTableId, the error is not specific to WHSFilterTable.

Skip triggers

If you look at the SQL views generated for entities, you may notice that quite a few of them do not have triggers at all. Triggers are generated only for entities that support set-based operations. Change the “Enable Set Based SQL Operations” property to “No”: no triggers will be generated and the entity will be synchronized without any issues.

Do not use TableId in the relation

Instead of having a relation by table id, we can remove it and add a range over the RefTableId field. The value for this range can be a SysQueryRangeUtil method that simply returns tableNum(CustTable). The value is calculated only once, during DB sync, and is then embedded into the view definition. That is what you can see in the standard WHSCustomerProductFilterEntity entity: it uses the WHSCustomerProductFilterEntityHelper::getCustomers() method to resolve this particular issue in the standard product.
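As a sketch, such a range method could look like this; the class and method names are illustrative, and WHSCustomerProductFilterEntityHelper::getCustomers() in the standard product follows the same pattern:

```
public static class DataEntityRangeHelper_IM // hypothetical helper class
{
    // Returns the CustTable table id as a query range value.
    // Evaluated once at DB sync time and embedded into the view definition.
    public static str custTableId()
    {
        return SysQuery::value(tableNum(CustTable));
    }
}
```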

Flighting

You can use the DbSyncEnableTableIdInDataEntityTriggerRelations flighting:

INSERT INTO SYSFLIGHTING (FLIGHTNAME, ENABLED, FLIGHTSERVICEID) VALUES ('DbSyncEnableTableIdInDataEntityTriggerRelations', 1, 12719367)

It forces the sync engine to use actual values instead of table ids during trigger generation, and it looks like it was introduced to deal with this particular issue. I have no idea why it’s not documented anywhere or enabled by default.

MSDyn365FO. How-to automatically start a build VM for a build

While we are waiting for Microsoft-hosted agent support (build without a VM) to be released, we have to keep a dedicated build VM that generally sits there and does nothing. To save costs we can set up a schedule to turn it on and off; however, you then have to align your build schedule with it, and that does not make a lot of sense if you do not have active development and make changes only once in a while.

Here is a quick and dirty workaround. You can get the Azure Virtual Machine Manager extension from the Visual Studio Marketplace.

Azure Virtual Machine Manager

It adds one task that can stop or start a VM.

Now we need to modify the standard build definition and add a new task to start the build VM.

You cannot add it to the existing agent job, because that one is self-hosted and requires the VM to be up and running: a catch-22! So, add another agent job:

MSHostedAgentJob.jpg

Bring it forward and add Azure VM task:

AzureVMStart.jpg

Now your pipeline should look like this:

BuildPipeline

And to make it work we need one more change. The build options demand DynamicsSDK, and obviously the new agent won’t have it and will fail, so I simply removed DynamicsSDK from the demands list. That’s why I call this quick and dirty!

BuildDemand

To stop the VM after the build, I put the Azure VM task at the very beginning of the release pipeline, which is automatically triggered after a successful build.

ReleasePipeLine.jpg

Using this neat extension, we can automatically start a VM before a build starts and then turn it off immediately after the build. I deliberately put the stop task in the release pipeline, so it won’t stop the VM if the build fails, and I can investigate or quickly rerun it without waiting for VM startup. Obviously, one day we will get the ability to use a Microsoft-hosted agent, but meanwhile this may help you save some $$.

MSDyn365FO. “Failed to publish Power BI Embedded report” error when you activate Embedded Power BI for OneBox.

If you want to enable Power BI Embedded for your OneBox (Tier 1) environment, follow this outstanding guide. However, it may not work for you, and you can get “Failed to publish Power BI Embedded report” or “LoadReportFailed” errors. One of the possible reasons (if you followed the guide precisely and did not mess with the setup) is the deprecation of Workspace Collections. The good news is that there is a workaround: the Azure subscription with the workspace collection should be connected to the LCS implementation project, and that’s it! Please note that it is not necessary to have the FnO VM or the Azure SQL Server in the same subscription as the workspace collection, so the workspace collection can be deployed to the implementation project subscription while the VM with the DB sits in your demo subscription, even in a different Azure region.

MSDyn365FO. How to Import CSV file using Electronic Reporting. Part 5 – Run import from X++ code.

In the previous 4 parts I showed how to build a new format without a single line of X++ code; unfortunately, to run it you must go to the ER workspace. However, it is possible to run it from X++ as well, and the code below can easily be invoked from a new button to give users a better experience.

var runner = ERObjectsFactory::createMappingDestinationRunByImportFormatMappingId(_mappingId, _integrationPoint);

runner.withParameter(inputParameters); //optional, to pass parameters if required

runner.run();

MappingId is the import format configuration. Usually, you have a parameter where a user can select the configuration, for example, the import format configuration field on the bank statement format table.

IntegrationPoint is a string value that you populate on the model mapping designer screen:

ModelMappingDesigner.jpg

Usually, the integration point is classStr(ERTableDestination) + '#' + tableStr(TargetTable).

InputParameters is an instance of the ERModelDefinitionInputParametersAction class, where you can add parameter name and value pairs:

ERModelDefinitionInputParametersAction inputParameters = new ERModelDefinitionInputParametersAction();
inputParameters.addParameter('$ExecutionID', _executionID).addParameter('$AccountId', bankAccount);
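Putting the pieces above together, here is a hedged sketch of a complete method; the target table name, parameter names, and parameter types are illustrative assumptions:

```
public static void runErImport_IM(RecId _mappingId, BankAccountId _bankAccount, str _executionID)
{
    // Integration point must match the value set on the model mapping designer screen
    str integrationPoint = classStr(ERTableDestination) + '#' + tableStr(BankStatementTable_IM); // hypothetical target table

    // Optional: parameters requested by the model mapping
    ERModelDefinitionInputParametersAction inputParameters = new ERModelDefinitionInputParametersAction();
    inputParameters.addParameter('$ExecutionID', _executionID)
                   .addParameter('$AccountId', _bankAccount);

    var runner = ERObjectsFactory::createMappingDestinationRunByImportFormatMappingId(_mappingId, integrationPoint);
    runner.withParameter(inputParameters);
    runner.run();
}
```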

MSDyn365FO. Electronic Reporting. Parse a text file with different record types.

In the previous blog post series we learned how to import a simple CSV file. However, CSV and other text files may contain records of different types that should be imported into different tables or processed differently. In this post we will enhance the format and mapping we created. Let’s say that in our example the first column represents the record type (a, b, or c):

CSVImportTestFile

In this case we need to change the format and add a new “CASE” element:

FormatCase

And 3 record sequences instead of one:

FormatCaseRecords

Note that each sequence has a predefined value for the String field; that’s how we tell ER that if a record has “a” in the first column it should be parsed with the RecordA sequence. Also, we changed Multiplicity to “One many” for the sequences, to tell ER that there is at least 1 record of each type. It could be set to “Zero many” if a record type is optional, or “Exactly one” if we expect it only once in a file.

Now we need to change the mapping. Each sequence has a system field, “IsMatched”, that is populated if a file record is matched to the sequence pattern. We will use it to bind the 3 record types to the same model field, but in real-life examples different records may go to different tables, like header and lines.

MapModelToFormatCase

The expression used is pretty simple: it takes the value from the first record type if it is matched; if not, it checks the second, and if that is not matched either, it takes the value from the third.

IF(@.Types.RecordA.IsMatched, @.Types.RecordA.Data.FieldInt, IF(@.Types.RecordB.IsMatched, @.Types.RecordB.Data.FieldInt, @.Types.RecordC.Data.FieldInt))

Also, using IsMatched, you can provide default values when a record type is missing from a file.
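For instance, a sketch in the same expression syntax, using the model paths from this example, that falls back to a default of 0 when no RecordB line is present in the file:

```
IF(@.Types.RecordB.IsMatched, @.Types.RecordB.Data.FieldInt, 0)
```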

MSDyn365FO. How to Import CSV file using Electronic Reporting. Part 4 – Map Model to Datasource.

That’s the last step. We have a Format to parse data into the model; now we need to write data from the model back to the actual tables. Open the model designer and click “Map model to datasource” in the action pane.

Create a new record, then set the name and description. Select the direction; for import it should be “To destination”:

MapDatasourceMapping

Open the designer. In the Data Sources area, add the data model:

MapDatasourceMappingAddModel

Set name, select model and definition:

MapDatasourceMappingAddModelProperties

Add new destination:

MapDatasourceMappingAddDestination

Select the table you want to insert into, set the name, specify whether it should support insert or update/insert, and whether you need validation to be triggered:

MapDatasourceMappingAddDestinationSelectTable

Bind the Record List from the model to the table’s record list, and bind the fields accordingly:

MapDatasourceMappingBind

Save and we are ready to test it!

Go back to the Model to Datasource mapping screen, select the mapping record, and click Run in the action pane:

MapDatasourceMappingRun

Select the file and check the result. I do not have a UI for this table, so I am just going to use the table browser:

ImportCSV_TableBrowser

It’s not that nice to trigger the import from the mapping record, but that’s the only way I know from the UI. In the next post we will look at how to trigger it from X++ code, so we can have a button somewhere.

As you can see, ER is a really powerful tool that allows users to do tasks that were not possible without development before.